Somehow it feels slightly odd to feel sympathetic for a multi-billionaire, but Facebook’s CEO Mark Zuckerberg was already having a bit of a tough time before last night’s global outage of all his platforms - Facebook, Instagram and WhatsApp - which apparently lasted about six hours. Still, it’s one for the back pocket in case any of our servers are ever down: ‘...even the big players have problems - remember when Facebook went offline...’. (Thankfully I can’t even remember the last time.)

It’s an embarrassment that probably couldn’t have come at a worse time for the beleaguered Mr Zuckerberg and his social media empire. He’s been on the receiving end of a growing wave of criticism for failing to deal effectively with the tide of abuse - from cyber-bullying and hate speech to political manipulation and misinformation - that saturates his platforms with alarming frequency, and for a lack of proactive action to remove it. This has resulted in him being hauled up in front of the US Senate to basically obfuscate and deny. However, denial has become that bit harder with the publication this weekend in the Wall Street Journal of “tens of thousands” of pages of internal Facebook research documents showing that the company knows precisely how toxic its own product is for the people who use it.

Whistleblower Frances Haugen, a former product manager in Facebook’s now-disbanded Civic Integrity Unit, provided the documents to The Wall Street Journal and gave an interview to CBS News’ 60 Minutes about the ways Facebook is poisoning society. She testified before Congress today and has filed complaints with the US Securities and Exchange Commission alleging that Facebook has lied to shareholders about its own product.

Haugen alleges that there’s a basic conflict between what’s good for Facebook and what’s good for society at large. In the CBS interview she explained that things that are good for Facebook tend to be bad for the world we live in, and that there are many of them.

1. Facebook’s algorithm intentionally shows users things to make them angry because that causes the most engagement and user engagement is what Facebook turns into advertising revenue. This is not unique to Facebook of course, but Haugen, who has previously worked at Pinterest and Google, insisted that Facebook is uniquely awful:

“I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.”

2. Facebook dissolved the 'Civic Integrity Unit' that Haugen worked in after the 2020 US election. The unit was created to combat political misinformation but Facebook seemed to think they were in the clear after the U.S. presidential election in November 2020 and closed it down. Subsequently the Capitol insurrection took place on January 6th 2021, undoubtedly fueled by misinformation published on the platform.

“They told us, ‘We’re dissolving Civic Integrity.’ Like, they basically said, ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast forward a couple months, we got the insurrection,” Haugen said.

It should be noted that Facebook’s impact isn’t just on American democracy, though; it’s global. One of the documents Haugen smuggled out of the company shows that political parties in Europe had to start running negative ads to get any engagement on Facebook.

Haugen explained that parties complained: “You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.”

3. Facebook only identifies a tiny fraction of hate and misinformation on the platform. One of the internal research studies leaked by Haugen shows that it identifies only 3-5% of hate speech on the platform and less than 1% of violence and incitement. However, Zuckerberg has stated that he believes Facebook is the best in the world at identifying hate and incitement on social media. In fact the company seems to think the real problem is the internet itself, according to a recent Facebook statement:

“If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago. We have a strong track record of using our research — as well as external research and close collaboration with experts and organizations — to inform changes to our apps.”

4. Facebook owns Instagram, and as Facebook’s own documents leaked by Haugen show, the company knows that 13.5% of teen girls say Instagram makes thoughts of suicide worse, and 17% say it makes their eating disorders worse.

“What’s super tragic is Facebook’s own research says, as these young women begin to consume this— this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more,” Haugen told CBS.

This is all part of the business model that makes Facebook lots and lots of money. Unsurprisingly, Facebook takes a very different view:

“We do internal research to ask hard questions and find out how we can best improve the experience for teens and we will continue doing this work to improve Instagram and all of our apps. It is not accurate that leaked internal research demonstrates Instagram is ‘toxic’ for teen girls. The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced. This research, like external research on these issues, found teens report having both positive and negative experiences with social media,” said Lena Pietsch, Facebook’s Director of Policy Communications.

5. Haugen said people who work at Facebook aren’t bad people (which sounds like the kind of thing someone who has worked at Facebook might say) but they do have perverse incentives:

“No one at Facebook is malevolent, but the incentives are misaligned. Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume. I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side effects of those choices are that hateful, polarizing content gets more distribution and more reach.”

But what matters, surely, is how the platforms are being used and abused today. Zuckerberg exerts almost complete control over Facebook, more than the boss of any other listed company. He sets the tone and makes the choices. If he knew something was wrong, he could change it. Facebook is exactly as Mark Zuckerberg wants it to be, and he evidently sees it as a force for good in the world whose benefits far outweigh any problems it creates. He must genuinely believe that, as there’s surely no way he’d intentionally destroy individual lives and undermine democracy.

Facebook frequently tries to point out that the problem is with the internet, not them, and that it is people, not the company, who share the content. They claim that the company is working hard to combat misinformation, hate, and fake news. In a statement to the Wall Street Journal the company said:

“Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true,” said Facebook’s Andy Stone.

The problem with Facebook’s argument is that, as the whistleblower has shown, Facebook knows exactly how bad it is for the world. Everything Facebook does is designed to increase engagement, and toxic content is precisely the content that generates the most of it. Putting an end to all of that terrible content on its platform would be technically very hard (as Zuckerberg recently admitted), but even if it weren’t, it would be bad for business. Helping people connect with each other is undoubtedly a good thing, but it comes at a cost. Mark Zuckerberg knows exactly what that cost is, but has evidently decided it’s worth it to keep the profit machine going.

“Ultimately, Zuckerberg has shown that he isn’t objective about the effects of what he built on the world. He has shown that he isn’t able to see Facebook for what it is, rather, he still views it as some idealistic platform that only exists in his own mind. That’s a problem, because Zuckerberg’s lack of awareness has real-world consequences. If Zuckerberg were to hand over control of the company he started in his Harvard dorm room, he’d still be fantastically rich. He’d still be a part of one of the most incredible Silicon Valley success stories ever. He’d go down as one of the world’s most successful entrepreneurs of all time. He’d just be giving up control of one of the world’s largest and most important social platforms. As hard as that might be, if Zuckerberg really cares about making Facebook a force for good, getting out of the way might just be the one thing that could make that happen,” wrote Jason Aten in Inc.