Mark Zuckerberg refutes whistleblower's claim that Facebook 'stokes division,' prioritizes profits

Facebook CEO Mark Zuckerberg is seen on stage during a town hall at Facebook's headquarters in Menlo Park, California, September 27, 2015. | (Photo: Reuters/Stephen Lam/File Photo)

Facebook CEO Mark Zuckerberg has refuted claims by a whistleblower and former product manager, who testified before Congress Tuesday that the social media giant prioritizes profits over stopping the spread of misinformation and hate. 

In a blog post Tuesday, Zuckerberg claimed that the media coverage surrounding the allegations presented by Frances Haugen "misrepresents our work and our motives." He further stated that "many of the claims don't make any sense."

"At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That's just not true," Zuckerberg exclaimed.

Haugen, the whistleblower and former Facebook manager who revealed her identity on CBS' "60 Minutes" on Sunday after leaking internal documents to the Wall Street Journal, testified before Congress Tuesday. She claims the social media giant is "lying to the public" about the harmful effects its platforms can have on people and society. 

Haugen alleged before the Senate Commerce Subcommittee on Consumer Protection that Facebook and its subsidiary Instagram are aware of how to make their apps safer, but those who can bring about changes to the platforms are ignoring calls for reform. 

Furthermore, she claimed that the platforms "stoke division" and "harm children."

"I joined Facebook in 2019 because someone close to me was radicalized online. I felt
compelled to take an active role in creating a better, less toxic Facebook," she wrote in written testimony.

"During my time at Facebook, first working as the lead product manager for Civic Misinformation and later on Counter-Espionage, I saw that Facebook repeatedly encountered conflicts between its own profits and our safety. Facebook consistently resolved those conflicts in favor of its own profits." 

Haugen argued that just as the government regulates tobacco, automobiles and opioids as public safety concerns, it should also place restrictions on social media platforms. 

"I implore you to do the same here," Haugen said. "Congressional action is needed. They won't solve this crisis without your help." 

Democrats and Republicans present at the hearing seemed to agree with Haugen. 

During the hearing, Haugen provided various documents that she copied from Facebook when she was employed by the company.

"These documents that you have revealed provided this company with a blueprint for reform, specific recommendations that could have made Facebook and Instagram safe," Sen. Richard Blumenthal, D-Conn., the subcommittee chair, chimed in at the meeting. "Facebook exploited teens using powerful algorithms that amplify their insecurities. Their profit was more important than the pain they caused."

During her appearance on "60 Minutes," she said that in 2018, Facebook made a change to its algorithms and programming that decides what users see on their Facebook news feeds in a way that optimizes engagement.

"But what its own research is showing is that content that his hateful, divisive, that is polarizing, its easier to inspire other people to anger than it is to other emotions," she said, adding that such content is "enticing" and keeps users on the platform. 

She said the company had enacted a few safeguards leading up to the 2020 election but removed them following the election results to increase growth on the platform. 

"Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money," she added.

Haugen's lawyer confirmed that the former Facebook employee has filed at least eight complaints with the Securities and Exchange Commission related to how the platform's algorithms amplified "misinformation."

Zuckerberg defended the changes made to the Facebook news feed. 

"This change showed fewer viral videos and more content from friends and family — which we did knowing it would mean people spent less time on Facebook, but that research suggested it was the right thing for people's well-being," Zuckerberg stressed. "Is that something a company focused on profits over people would do? The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don't want their ads next to harmful or angry content."

For two years, Haugen worked alongside a team to combat political misinformation. However, she said she became "disillusioned" by the company's push for growth despite the safety measures she believed could have been taken. 

"As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable," Haugen reported in her testimony. "Until the incentives change, Facebook will not change. Left alone, Facebook will continue to make choices that go against the common good." 

Last month, a compilation of alleged internal research and communications was leaked to The Wall Street Journal, which published an investigative series on the harms caused by the company's social media platforms.

Internal studies have shown that many teenage girls have reported increased issues with mental health as a direct result of Instagram usage. Instagram and Facebook apps have also reportedly been used by human traffickers and drug cartels. 

Zuckerberg directly responded to claims that Facebook ignored research on the negative impacts of its platforms. 

"If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?" he asked. "If we didn't care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us?"

"If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we're doing?" he asked. "And if social media were as responsible for polarizing society as some people claim, then why are we seeing polarization increase in the US while it stays flat or declines in many countries with just as heavy use of social media around the world?"

A Facebook spokesperson said in a statement provided to CNN that the company seeks to "balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place."

Neil Potts, Facebook's vice president of trust and safety policy, further denied the allegations in an interview with NPR on Monday. 

"I think that accusation is just a bit unfounded," Potts said. "At Facebook, what we are designing our products to do is to increase meaningful experiences, so whether those are meaningful social interactions ... or having just positive social experience on a platform, that is what we want the product ultimately to provide. That makes an environment where people will come to Facebook, where they will come to Instagram, and have a better time, and that's really our bottom line."
