
The UK Information Commissioner's Office (ICO) has called for greater UK regulatory powers and a code of conduct to better clamp down on fake news and data misuse, amid concerns about the role platforms like Facebook play in democracy.

On Tuesday (6 November), information commissioner Elizabeth Denham gave evidence to parliament, reflecting on what she described as a “disturbing disregard for voters’ personal privacy” from social media platforms, political parties, data brokers and credit reference firms.

Speaking at the Digital, Culture, Media and Sport (DCMS) committee’s disinformation and ‘fake news’ inquiry, chaired by MP Damian Collins, Denham opened up on the ICO’s investigation into the use of data analytics in political campaigns.

At the centre of the investigation was pro-Brexit lobbying group Leave.EU. Though the organisation had been linked to the Facebook and Cambridge Analytica controversy, the main thread of the ICO’s report focused instead on a “serious” breach of online direct marketing laws, for which Leave.EU and Arron Banks’ firm Eldon Insurance have now been fined a total of £135,000.

Reflecting on the “unprecedented” case, which opened in May 2017 following the UK’s EU referendum, Denham said: “We had little idea of what was to come.”

“We may never know whether individuals were unknowingly influenced to vote a certain way in either the UK EU referendum or the US election campaigns. But we do know that personal privacy rights have been compromised by a number of players and that the digital electoral ecosystem needs reform.”

She added: “But what’s at stake is the fundamentals of our democratic processes. People have to be able to trust the systems, so it’s important that we get to the bottom of this.”

Online platforms must take 'more responsibility'

Denham’s investigation has identified 71 witnesses of interest, reviewed the practices of 30 organisations and is currently working through 700 terabytes – the equivalent of 52 billion pages – of data. Ukip, Cambridge Analytica’s Alexander Nix and Cambridge University’s Aleksandr Kogan all refused to talk to the ICO under caution.

The body’s current inability to compel interviews from suspects has “frustrated” the investigation.

At the core of the investigation was Facebook. Its chief executive Mark Zuckerberg has turned down requests to attend the committee, but Denham said it would be “very useful” for him to appear so that the regulator could deal with those at the top of the company, rather than its regional leadership.

She urged platforms like Facebook to "take much more responsibility" but acknowledged that it had acted voluntarily to better police political ads (a feature that has already been abused by Vice and Business Insider). More broadly, she said: “They should be subject to stricter regulation and oversight”.

Facebook’s defence has been pinned on the argument that the data it holds about users is inferred from their preferences and habits.

However, Denham said: “If you’re targeting people on the basis of inferred data, that is personal data”. In particular, she outlined: “The use of lookalike audiences should be made transparent to individuals. They need to know that a political party is making use of lookalike audiences.”

Denham also reflected on Facebook’s recent £500,000 fine from the body – the maximum penalty available.

“We also found evidence that as recently as spring 2018, some of [Facebook’s] data was still there at Cambridge Analytica. So there’s evidence that the follow-up was less than robust, which is part of the reason we fined Facebook.”

The social network also faces a potentially larger fine from the European Union (EU), which could total $1.63bn.

With these, and more, policy and regulatory issues on the horizon, the social media giant recently hired former UK deputy prime minister Nick Clegg as head of global affairs and communications.

Clegg has been tasked with helping ensure the site is “working with people, organisations, governments and regulators around the world to ensure that technology is a force for good”.

The Drum has reached out to Facebook for comment on the ICO’s report and recommendations; at the time of writing, it had yet to respond.

'Time for self-regulation' is over

To tech companies, Denham said: “The time for self-regulation is over.”

Citing disinformation and misinformation, harm to children and more, it was her opinion that the UK parliament needs to set objectives and outcomes for tech companies to follow, including a code of practice developed by a regulator.

This body must be active in its scrutiny of takedowns, bots and the policing of data at these companies, she said.

The ICO also called for a code of practice for the use of personal data in campaigns and elections, enshrined in law, and urged the government to consider whether the current regulatory bodies are fit for purpose, proposing a “hybrid model between Ofcom and the ICO” that would have to balance freedom of expression against internet harms.

Meanwhile, returning to the catalyst of the investigation, Denham reflected on Leave.EU. It was found to have breached data laws by pooling data with founder Arron Banks’ firm Eldon Insurance. The investigation found that more than 1m unsolicited emails promoting Banks’ business were sent to Leave.EU subscribers.

She outlined that there were “concerns of misuse of personal data,” adding that “the processes must be able to separate data from a political campaign from the insurance campaign”.

Data had reportedly been shared both ways between Banks’ company and Leave.EU.

Responding to the fine, Banks tweeted "we may have accidentally sent a newsletter to customers" but added that there was "no evidence of a grand data conspiracy".

Denham’s priority for the government was to develop a code that is “backed with the statute, extraterritorial reach, sanction – the powers the ICO has, those are the powers that a regulator needs to look at content and conduct online”.

She concluded: “This is not the end of our work; you can see there are several strands that will take us into the future.”

The full ICO report can be read here.