Former Facebook employee denounces the company: 'Facebook is tearing our society apart'

Frances Haugen, a former Facebook employee, shared why she decided to reveal the inside story of the social networking company she used to work for.

Frances Haugen, a former Facebook employee

Frances Haugen is a former Facebook employee whose role involved helping the world's largest social networking company fight misinformation. Recently, she decided to speak out against the company, sharing the platform's dark secrets on the CBS (USA) news program 60 Minutes.

In short, she says, the tech company prioritizes profit over safety and is itself "dividing our society".

Frances Haugen used to be a Facebook employee.

Choosing profit over the public good

“What I see at Facebook many times is a conflict of interest between what is good for the public and what is good for Facebook. And Facebook, time and time again, has chosen to optimize for its own benefit, like making more money," Haugen said.

She also accused Facebook of endangering public safety by reversing changes to its own algorithm after the 2020 presidential election ended, allowing misinformation to spread on the platform again.

"As soon as the election ended, they turned them off [safety systems] or changed the settings back to the way they were before, to prioritize growth over safety. And that really makes me feel seen as a betrayal of democracy".

Comparing Facebook with other platforms

In her 15 years as a technology professional, Haugen, now 37, has worked for other major players in the industry such as Google and Pinterest. But she said Facebook's approach to restricting harmful content is the worst she has seen.

"I've looked through a bunch of social networks and it's actually worse at Facebook, than anything I've seen before," she said.

Referring to Mark Zuckerberg, the founder and CEO of Facebook, she commented: "I have a lot of sympathy for Mark. Mark never set out to create a hateful platform. But he has allowed choices to be made where the side effect of those choices is that hateful, polarizing content gets more distribution and more reach."


Instagram and mental health

Recent research shows that Facebook's Instagram app is taking a toll on the mental health of some teenage users, with 32% of teen girls saying the platform made them feel worse about their bodies.

And Haugen says the research bears this out.

"What's super tragic is that Facebook's own research shows that, as these young women begin to consume these kinds of dysfunctional content, they become more and more depressed. And it actually drives them to use the app more. So the end of this feedback cycle is that they hate their bodies more and more," she said. "Facebook's own research shows that not only is Instagram dangerous for teens, it's harmful for teens, but it's clearly worse than other forms of social media."

Facebook dismissed the report's claims as a "misrepresentation" of its research.

Why did Haugen come forward?

Haugen decided to come forward

Haugen said "one after another" has tried to resolve Facebook's problems, without success.

"Imagine you know what's going on inside Facebook and you know no one on the outside knows. I know what my future will look like if I stay inside Facebook," she said.

After joining the company in 2019, Haugen said she decided to take action this year and began copying tens of thousands of documents from Facebook's internal systems, which she believes show that Facebook has not made any significant progress in combating online hate and misinformation.

"At some point in 2021, I realized, 'Okay, I'm going to have to do this systematically, and I've got to get out far enough that nobody can suspect this is real or not. ".

Frances Haugen's interview with CBS comes weeks after she provided thousands of pages of internal documents to the US Securities and Exchange Commission, as well as to the Wall Street Journal.

Facebook and violence


Haugen said Facebook contributed to ethnic violence, alluding to incidents in Myanmar. In 2018, Facebook admitted that its platform had been used to "foment division and incite offline violence" in the country.

"When we live in an information environment filled with angry, hateful, polarizing content, it erodes the trust of our citizens, it erodes, Haugen said on television. our trust in each other, it erodes our ability to care about each other.The version of Facebook that exists today is dividing our societies and fueling ethnic violence around the world. ".

The January 6 riot in the US, in which a crowd of right-wing protesters stormed the Capitol, came after Facebook disbanded the Civic Integrity team of which Haugen was a member. The team specialized in election-related issues around the world, and its staff were dispersed to other Facebook units after the US presidential election.

"They told us, 'We're dissolving Civic Integrity.' They basically said, 'Oh well, we passed the election. No riots. We can get rid of Civic Integrity. right now,'" she recalls.

And when Facebook dissolved Civic Integrity, that was when Haugen thought, "I don't believe they're willing to actually invest what it takes to keep Facebook from being dangerous."

Facebook's 2018 Algorithm Change



In 2018, Facebook changed the algorithm behind its news feed - the platform's central feature that shows users customized content such as friends' photos and news stories - to prioritize content that drives user interaction.

This makes divisive content more prominent, says Haugen.

"One of the consequences of the way Facebook picks out that content today is that it's optimizing for content that garners engagement or reaction. But their own research is showing that hateful content. Hatred, divisive, polarizing... is what inspires people to be angry more than any other emotion."

She added: "Facebook has realized that if they change their algorithm to be safer, people will spend less time on the site, they will click on fewer ads, they will make less money." .

Haugen said European political parties have reached out to Facebook to say that the change to the news feed is forcing them to take more extreme political stances to get users' attention. Describing the politicians' concerns, she said: "You're forcing us to take positions that we don't like, that we know are bad for society. We know that if we don't take those positions, we won't win in the marketplace of social media."

Responding to these reports, a Facebook representative said: "Every day, our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place."

The social media company says it has been making significant improvements to tackle the spread of misinformation and harmful content.

Frances Haugen is due to testify before a subcommittee of the US Senate. The subject of the hearing is "Protecting Children Online," focusing on Instagram's impact on the mental health of young users.

Source: The Guardian
