As Frances Haugen took her seat in an opulent Victorian-style room in London’s Houses of Parliament on Monday, she remarked that the surroundings were much grander than those in Washington, where the Facebook whistleblower had given evidence to Senators two weeks earlier.
One member of the panel of British lawmakers she was there to testify before responded that the small room, decorated with ruby-red wallpaper and grand paintings, had been chosen for precisely that reason: COVID-19 restrictions meant the hearing was closed to the general public and most journalists—so there was no need to hold it in the drab office building across the road, where evidence sessions usually accommodate a larger audience.
Haugen’s stop in London was the first on an extensive European tour planned for November, in which the whistleblower will meet lawmakers from across the continent who are drafting laws to place new restrictions on Facebook and other Big Tech companies. The effort is part of a push by Haugen and her team to turn up the heat on Facebook in jurisdictions that historically have been faster—and more willing—to regulate the impact that American tech giants have on people’s lives and the societies they live in. Efforts to rein in Facebook’s algorithms in the U.S. have hit roadblocks of partisan disagreement over alleged censorship, even as Haugen’s disclosures have brought Republicans and Democrats closer to agreement on the need to regulate the company.
Read more: How the E.U.’s Sweeping New Regulations Against Big Tech Could Have an Impact Beyond Europe
“Mark Zuckerberg has unilateral control over 3 billion people,” Haugen said of Facebook’s CEO as the hearing got underway. “There’s no will at the top to make sure these systems are run in an adequately safe way. And I think until we bring in a counterweight, things will be operated for the shareholders’ interest and not for the public interest.”
The Facebook whistleblower was in London on the same day that an avalanche of new stories was published by more than a dozen publications—based on internal Facebook documents that she had shared with them—weeks after an initial batch was published by the Wall Street Journal. The “Facebook Papers” stories published on Monday revealed further details about how the company fails to moderate harmful content in developing countries, how it stumbled ahead of the 2020 U.S. election, and how it knew that Filipina maids were being abused and sold on the platform but did little to act.
In a quarterly earnings call on Monday, in which the company disclosed increased year-on-year profits, Zuckerberg called the stories “a coordinated effort to selectively use leaked documents to paint a false picture of our company.” Facebook spokespeople have said the company welcomes government regulation and that it spends more on safety work than its competitors.
Read more: The 5 Most Important Revelations From the ‘Facebook Papers’
As the “Facebook Papers” stories added to mounting evidence that Facebook systemically chose profit over safety time and time again, Haugen was embarking on the next stage of her project: making sure new laws around the world reflect the internal reality of how giant social media companies work.
The lawmakers quizzing Haugen in London were part of a cross-party committee scrutinizing the U.K.’s wide-ranging new Online Safety Bill, which, if passed, would require tech companies to prevent “online harms” or face steep penalties.
Haugen spoke in support of the bill at the hearing. “I am incredibly excited and proud of the U.K. for taking such a world-leading stance with regard to thinking about regulating social platforms,” Haugen told the lawmakers. “The Global South currently does not have the resources to stand up and save their own lives. They are excluded from these discussions,” she said. “The U.K. has a tradition of leading policy in ways that are followed around the world. I can’t imagine Mark [Zuckerberg] isn’t paying attention to what you’re doing. This is a critical moment for the U.K. to stand up and make sure these platforms are in the public good, and are designed for safety.”
As Haugen turned up the metaphorical heat on Facebook during her testimony, one side effect of the decision to hold the hearing in an old, small room with no air conditioning quickly became apparent: the actual air temperature shot up, too. During a break in the proceedings, officials cracked open two old windows, to little effect. As she stepped out for a breather, Haugen remarked that the committee room in Washington had at least been much cooler.
But as the hearing wore on, it became clear that there were significant differences between the type of regulation Haugen proposes and the type that would come into force if the U.K. bill were passed in its current form.
For starters, Haugen’s central recommendation is for regulation to focus on the algorithmic amplification systems that, according to internal Facebook research, systemically boost divisive and polarizing content. But the draft U.K. bill is largely focused on content, not algorithms.
Speaking to TIME after the hearing, Damian Collins, the chair of the committee scrutinizing the bill, who has the power to suggest amendments, said he would likely try to change the bill to focus more on systemic algorithmic harms, in light of evidence from Haugen and another former Facebook insider and whistleblower, Sophie Zhang, who testified to his committee last week.
“The analysis of the [algorithmic] systems is completely integral to all of this. This can’t just be about content moderation,” Collins told TIME. “That is something we’re looking at.”
Collins noted that the U.K.’s draft online safety bill would give powers to a national watchdog that could, if it chose, effectively subpoena Facebook for evidence about how its algorithms work, but said his committee was considering whether to change the wording of the law to recommend specific requests the regulator could make.
Read more: Damian Collins Is Leading Britain’s War Against Facebook
Haugen also said in her testimony that an ideal piece of regulation, in her mind, would force Facebook to disclose which safety measures it has at its disposal, in which languages they are available, and how effective they are, broken down by language. Doing so, she suggested, would expose the inequities in Facebook’s global safety apparatus and increase the incentive for it to fix its systems.
Her proposal seemed to have an effect on the lawmakers with the power to shape Big Tech legislation. “I think that is a really good suggestion,” Collins said. “This is not just a bill about content.”