The data-privacy watchdog has written to a Facebook whistleblower, requesting her full evidence to see whether the technology company has broken UK law.
Elizabeth Denham, the information commissioner, has stated that she intends to examine the documents from a UK perspective, particularly those relating to children.
Former Facebook employee Frances Haugen claimed the social media company kept information about how it used data “behind walls.”
However, Facebook founder Mark Zuckerberg dismissed her claims.
“Most of us simply don’t recognise the distorted image of the company that is being painted,” he explained.
Ms Denham, who is stepping down next month, told BBC News: “We’re looking very closely about what is publicly available right now from Frances’s testimony – but I’ve also written to her to ask for access to the full reports of her allegations.
“Because what I want to do with that information is analyse it from the perspective of the United Kingdom – are these harms applicable in the United Kingdom, particularly through the lens of children? We have implemented a new children’s code that specifies design considerations for protecting children online.
“I want to see if these allegations point to any violations of UK law before taking action.”
According to the whistleblower, Facebook’s products could endanger children’s mental health and exacerbate societal divisions.
Ms Haugen, who worked on the company’s algorithmic products, told a US Senate committee: “Facebook’s closed design means it has no real oversight.”
“Only Facebook knows how it customises your feed for you.”
“Facebook conceals itself behind barriers that prevent researchers and regulators from understanding the true dynamics of their system.”
Ms Haugen will testify before the UK Parliament’s online safety bill committee on 25 October.
Elizabeth Denham has done far more than most to limit the power of big tech. But she is concerned about the power disparity between democracies and Silicon Valley, as well as efforts to politicise the regulator she is leaving.
For the past five years, stories about the harms caused by social media behemoths have followed a monotonous and predictable pattern.
First, a scandal breaks out. Then comes the cry: “Something Must Be Done!” (See also: We Will Not Put Up With This!)
Then, in response to headlines and noise on (ironically) social media, a meeting is called, to which tech titan ambassadors are summoned.
Finally… nothing much happens.
This pattern has been observed numerous times.
And a recurring theme in these stories is how large-scale societal harm frequently falls between regulators, with Ofcom, the Advertising Standards Authority, and the Competition and Markets Authority all understandably sticking to their own remits.
Elizabeth Denham, the outgoing information commissioner, has been something of an antidote to this trend.