
Facebook whistleblower Frances Haugen testifies before the Senate

After revealing her identity on Sunday evening, Frances Haugen, the whistleblower who leaked controversial Facebook documents to The Wall Street Journal, testified before the Senate Committee on Commerce, Science, and Transportation on Tuesday.

By Damian Peters


Haugen's testimony followed a hearing last week in which Antigone Davis, Facebook's global head of safety, was questioned about the company's negative effects on children and teens. Davis stuck to Facebook's script, frustrating senators by declining to answer direct questions. Haugen, a former product manager on Facebook's civic misinformation team, was far more forthcoming.

An expert in algorithmic product management, Haugen has worked at companies including Google, Pinterest, and Yelp. At Facebook, she addressed issues related to democracy and misinformation.

In her opening statement, Haugen said she had worked on four different types of social networks and understood how complex and nuanced these problems are. "But, the choices made by Facebook are dangerous -- for our children and our safety as well as for our democracy and privacy -- and we must insist that Facebook makes changes."

Haugen said she believes Facebook's current news feed algorithm, which rewards posts that generate meaningful social interactions (MSIs), is dangerous. Introduced in 2018, the algorithm prioritizes interactions (such as comments and likes) from the people Facebook considers your closest friends and family.

However, the documents Haugen leaked show that Facebook's own data scientists raised concerns that the algorithm was having an "unhealthy side effect on important pieces of public content," such as news and politics.

Facebook also uses engagement-based ranking, in which an AI surfaces the content it predicts individual users will engage with most. The more engagement content provokes, the higher it ranks, which tends to amplify misinformation, toxicity, and violent content. Haugen said she believes switching to chronological ranking would mitigate these negative effects.

"I spent the majority of my career working with systems such as engagement-based ranking. Haugen stated that he was essentially blaming 10 years worth of his own work when he spoke out in front of the hearing.

Haugen explained on "60 Minutes" Sunday night that she had been a member of Facebook's civic integrity team, which the company dissolved after the 2020 election. Facebook had put safeguards in place to curb misinformation ahead of the 2020 U.S. presidential election, but turned them off once the election was over. It turned them back on only after the January 6 attack on the U.S. Capitol.

"Facebook altered those safety defaults during the campaign because they knew they were unsafe. They wanted to see that growth again after the election so they went back to their defaults," Haugen stated. "I find that deeply problematic."

Haugen argued that Facebook presents a false choice: either it keeps its volatile algorithms and continues its rapid growth, or it prioritizes user safety and declines. She believes that accepting oversight from government agencies, academics, and researchers would make the platform safer and, in time, improve its bottom line.

"The thing that I want is a shift [away] form short-termism, which is what Facebook runs under today. Haugen stated that the company is being managed by metrics, not people. It's possible for Facebook to be more profitable five or ten years down the line if there's proper oversight and some of these restrictions. This is because it was not as toxic and not as many people quit.

Asked, as a thought experiment, what she would do in CEO Mark Zuckerberg's shoes, Haugen said she would establish policies for sharing information with oversight bodies, including Congress; work with academics to make sure they have the data they need to research the platform; and implement the "soft interventions" that were identified to protect the integrity of the 2020 election. For example, she suggested requiring users to click on a link before sharing it, since other companies, such as Twitter, have found that this kind of friction reduces misinformation.

Haugen said that Facebook as currently structured cannot stop the spread of vaccine misinformation, because it relies too heavily on AI systems that, by Facebook's own account, will likely never catch more than 10% to 20% of offending content.

Haugen later told the committee that she favors reforming Section 230, the part of the United States Communications Decency Act that shields social media platforms from liability for content their users post. She believes Section 230 should be amended to exclude decisions about algorithms, so that companies would face legal consequences if their algorithms are found to cause harm.

Companies have less control over user-generated content, Haugen noted, but they have full control over their algorithms. Facebook, she argued, should not get a pass on choices that prioritize growth, virality, and reactiveness over public safety.

Senator John Hickenlooper (D-CO) asked how Facebook's bottom line would be affected if the algorithm promoted safety. Haugen said it would take a hit: users spend more time on Facebook when they see more engaging content, even when "engaging" means enraging rather than entertaining, and more time on the platform means more advertising dollars. Still, she believes Facebook could remain profitable if it took the steps she suggested to improve user safety.

As The Wall Street Journal reported in one of its Facebook Files stories, Facebook employees raised concerns about the platform being used overseas to organize violent crime, and the documents Haugen leaked indicate that the company's response was inadequate.

Employees expressed concern, for example, that armed groups in Ethiopia could use Facebook to coordinate attacks on ethnic minorities. Because Facebook's moderation relies on artificial intelligence, its AI must be able to function in every language and dialect spoken by its nearly 2.9 billion monthly active users. According to the WSJ, Facebook's AI systems do not cover all of those languages. Haugen said that even though only 9% of Facebook's users speak English, 87% of the platform's misinformation spending is directed at English speakers.

Haugen said Facebook appears to invest the most in the users who earn it the most money, even though the dangers posed by its platform are not distributed along the same lines. She also said she believes Facebook's chronic understaffing of its counter-espionage and information operations teams is a national security risk, one she is raising with Congress.

Members of the Senate committee signaled that they are motivated to take action against Facebook, which is also in the middle of an antitrust lawsuit.

Haugen, however, spoke out against breaking up Facebook. "If you separate Instagram and Facebook, it's likely that the majority of advertising dollars will go to Instagram. And Facebook will continue to be this Frankenstein that is endangering lives all over the globe, but now there won't even be enough money to fund it."

Critics argue that yesterday's six-hour Facebook outage -- which was unrelated to today's hearing -- shows the risk of one company controlling so much infrastructure, particularly when platforms such as WhatsApp are vital to international communication.

Legislators are currently working on legislation to make social media platforms safer for minors. Last week, Sen. Ed Markey (D-MA) announced the Kids Internet Design and Safety (KIDS) Act, which seeks new protections for users under 16. And today, Sen. John Thune (R-SD) brought up the Filter Bubble Transparency Act, a bipartisan bill he introduced with three other committee members in 2019 that would let users view content that has not been curated by opaque algorithms.

Senator Richard Blumenthal (D-CT) suggested bringing Haugen back for another hearing focused on her concerns that Facebook is a threat to national security. Even as Facebook's leadership spoke out against Haugen, policymakers were clearly moved by her testimony.
