Whistleblower Frances Haugen tells UK lawmakers Facebook is making hate online worse


LONDON – Facebook whistleblower Frances Haugen told UK lawmakers on Monday that the social media giant stokes hatred and extremism online, fails to protect children from harmful content and has no incentive to address the issues, giving a boost to European governments' efforts to impose stricter regulation on technology companies.

Although her testimony echoed much of what she told the U.S. Senate this month, her in-person appearance drew strong interest from a UK parliamentary committee that is much further along in drafting legislation to curb the power of social media companies.

Her appearance came the same day Facebook FB published its latest quarterly results and the Associated Press and other news organizations began publishing stories based on thousands of pages of internal company documents she obtained.

Also see: Facebook profits exceed $9 billion, but Apple’s change puts sales in the spotlight

Haugen told the UK lawmakers’ committee that Facebook groups amplify hatred online, saying algorithms that prioritize engagement take people with mainstream interests and push them toward the extremes. The former Facebook data scientist said the company could add moderators to prevent groups over a certain size from being used to spread extremist views.

“Undoubtedly, it makes the hatred worse,” she said.

Haugen said she was “shocked to hear recently that Facebook wants to double down on the metaverse and that they’re going to hire 10,000 engineers in Europe to work on the metaverse,” referring to the company’s plans for an immersive online world that it says will be the next big internet trend.

“I was like, ‘Wow, do you know what we could have done with safety if we had 10,000 more engineers?’” she said.

Facebook said it welcomes regulation for tech companies and that it is glad the UK is leading the way.

“While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren’t making these decisions on our own,” Facebook said Monday.

The company said it has invested $13 billion (£9.4 billion) in safety and security since 2016 and claimed it has “almost halved” the amount of hate speech on its platform over the last three quarters.

Haugen accused Facebook-owned Instagram of failing to keep children under 13 – the minimum age for use – from opening accounts, saying the company was not doing enough to protect children from content that, for example, makes them feel bad about their bodies.

“Facebook’s own research describes it as an addict’s narrative. The children say, ‘This makes me unhappy, I feel like I don’t have the ability to control my usage of it, and I feel like if I left I’d be ostracized,’” she said.

Last month, the company delayed plans for a kids’ version of Instagram, aimed at those under 13, to address concerns about the vulnerability of younger users. Haugen said she worried it may not be possible to make Instagram safe for a 14-year-old, and that “I sincerely doubt it’s possible to make it safe for a 10-year-old.”

She also said Facebook’s moderation systems are worse at catching content in languages other than English, and that this is a problem even in the UK because it is a diverse country.

“Those people are also living in the UK and being fed misinformation that is dangerous, that radicalizes people,” Haugen said. “And so language-based coverage is not just a question of individuals, it’s a national security issue.”

Pressed on whether she believes Facebook is fundamentally evil, Haugen demurred, saying, “I can’t see into the hearts of men.” Facebook is not malevolent but negligent, she suggested.

“It believes in a world of flatness, and it won’t accept the consequences of its actions,” she said, pointing to the company’s gigantic open-plan, single-story headquarters as an embodiment of that philosophy.

She argued that Facebook has a culture that discourages rank-and-file employees from bringing concerns to senior management. For many of them, including CEO Mark Zuckerberg, it is the only place they have ever worked, which contributes to the cultural problem, she said.

It was Haugen’s second appearance before lawmakers, after she testified in the U.S. about the dangers she says the company poses, from harming children to inciting political violence to fueling misinformation. Haugen cited internal research documents she secretly copied before leaving her job in Facebook’s civic integrity unit.

The documents, which Haugen provided to the U.S. Securities and Exchange Commission, allege that Facebook prioritized profits over safety and hid its own research from investors and the public. Some stories based on the files have already been published, exposing internal turmoil after Facebook was caught off guard by the Jan. 6 riot at the U.S. Capitol and showing how it dithered over curbing divisive content in India; more are to come.

Representatives from Facebook and other social media companies plan to address the UK committee on Thursday.

UK lawmakers are drafting an online safety bill that calls for setting up a regulator to hold companies to account for removing harmful or illegal content from their platforms, such as terrorist material or images of child sexual abuse.

“This is quite a big moment, much like Cambridge Analytica, but maybe bigger in that I think it provides a real window into the soul of these companies,” Damian Collins, the lawmaker chairing the committee, said ahead of the hearing.

He was referring to the 2018 debacle involving data-harvesting firm Cambridge Analytica, which gathered details on as many as 87 million Facebook users without their permission.

Haugen is scheduled to meet next month with European Union officials in Brussels, where the bloc’s executive arm is updating its digital rules to better protect internet users by holding online companies more accountable for illegal or dangerous content.

Under the UK rules, expected to take effect next year, Silicon Valley giants would face a penalty of up to 10% of their global revenue for any violations. The EU is proposing a similar penalty.
