Reporters Reveal 'Ugly Truth' Of How Facebook Enables Hate Groups And Disinformation

Joel Saget / AFP via Getty Images

In their new book, An Ugly Truth, New York Times journalists Sheera Frenkel and Cecilia Kang report that Facebook CEO Mark Zuckerberg often doesn't see the downside of the social media platform he created. They write that Zuckerberg tends to believe that free speech will drown out bad speech.

"[Zuckerberg's] view was that even if there were lies [on Facebook] — lies from a politician such as Donald Trump — that the public would respond with their own fact checks of the president and that the fact checks would rise to the top," Frenkel says.

Frenkel, who is based in San Francisco, covers cybersecurity for The Times; Kang is based in Washington, D.C., and covers technology and regulatory policy. Their book focuses on the period between the 2016 presidential campaign and the Jan. 6 insurrection at the U.S. Capitol — a time in which Trump became one of Facebook's most profitable users.

"Trump had over 30 million followers," Frenkel tells Fresh Air. "He not only managed to bring audience and relevancy to Facebook, he created this constant sort of churning stream of information that people couldn't help take their eyes off of."

Following the 2020 presidential election, the Facebook platform became key in the "Stop the Steal" effort to challenge the election results, with users posting photos of assault rifles and openly discussing how they were going to bring guns to Washington on Jan. 6.

"I had never seen a Facebook group grow so quickly, adding thousands of users within hours to this group in which they were sharing all sorts of falsified videos and documents about election fraud," Kang says. "It's very clear from our reporting that Facebook knew the potential for explosive violence was very real [on Jan 6]."

Kang and Frenkel say that the company debated having Zuckerberg call Trump to try to defuse the Jan. 6 rally ahead of time, but it ultimately decided not to do so. After the insurrection, Facebook suspended Trump's account for two years, saying it would reinstate him only if "the risk to public safety has receded."

Kang notes that the fact that Trump is no longer in office has helped Facebook avoid an extensive discussion of the ban. But political disinformation remains a problem for the social media platform, which has nearly 3 billion global users.

"There are elections coming up in a number of countries where the current head of state is very active on Facebook and uses Facebook much in the way that was modeled by Donald Trump," Kang says. "Millions of people all over the world are being affected in democracies that are being threatened by populist leaders using Facebook."

Facebook responded to the assertions in An Ugly Truth with the following statement:

"Every day, we make difficult decisions on where to draw the line between free expression and harmful speech, on privacy, security, and other issues, and we have expert leaders who engage outside stakeholders as we craft our policies. But we should not be making these decisions on our own and have for years advocated for updated regulations where democratic governments set industry standards to which we can all adhere."


Interview Highlights

An Ugly Truth, by Sheera Frenkel and Cecilia Kang
HarperCollins

On Facebook's decision to ban Trump from its platform following the Jan. 6 insurrection

Cecilia Kang: There was immediate sort of understanding that this was a watershed moment and that they were going to have to have the discussion they dreaded having for a very long time, which is do they remove Donald Trump? And we see them debate that. We see them go back and forth. And really, it's not until Twitter takes action to ban Trump that Facebook sort of makes its announcement — at first that it's a temporary suspension. It's very unclear and muddled. Their messaging is that we're removing him for now, but we're going to reevaluate. And ultimately, it's finally announced that they're going to suspend the account, but they're going to refer it to the Facebook Oversight Board. They're essentially really, again, kicking the can to someone else and saying, "We've created this outside body. I'm going to allow them to rule on whether or not we should have removed Donald Trump."

Sheera Frenkel: The ban was for a couple of weeks. The language was quite interesting. It was indefinite and temporary is the way they described it. They referred it to this body that they describe as a Supreme Court, [a] third-party body that makes decisions on content. Interestingly, months later, the body, the Facebook Oversight Board, kicked it back, that decision on Trump to Facebook and they said, "Facebook, you don't have policies that are clear enough on this kind of political speech and taking down an account like Trump, you have to write those policies." It was actually a pretty smart move by the Facebook Oversight Board. So currently the final decision on Trump is in the hands of Facebook. They have said that for at least two years, Trump will be banned and that two years expires, essentially ahead of his ability to campaign again for the 2024 campaigns.

On how social media companies are often creating policies about misinformation on the fly

New York Times journalists Cecilia Kang (left) and Sheera Frenkel are co-authors of An Ugly Truth.
Beowulf Sheehan / HarperCollins

Frenkel: The social media companies are all struggling, and they're creating policies as we go. I will say that Twitter, though it's much smaller ... compared to Facebook, especially when you put Facebook together with its other apps, WhatsApp and Instagram, Twitter is willing to be more experimental. It's quite public in its approach and writing of its policies. I'm not saying that they've got it completely right. YouTube is still very far behind. These social media companies are all struggling with how to handle misinformation and disinformation, and along the lines of misinformation, it is a very current and present danger in that just recently, the chief of staff of the White House, Ron Klain, was saying that when the White House reaches out to Americans and asks why aren't they getting vaccinated, they hear misinformation about dangers with the vaccine. And he said that the No. 1 place where they find that misinformation is on Facebook.

On Facebook's updated political ad policy

Kang: Facebook still allowed politicians to post ads without being fact-checked. And, in fact, politicians could say things in advertisements that the average Facebook user could not. They did change other things in the platform. For instance, they created an ad library where you could search for ads and see what politicians were posting. And that was a level of transparency they hadn't previously had. However, they did double down, and they did maintain firm in their belief that politicians could say things in ads without the benefit of a fact check.

On Zuckerberg and Facebook Chief Operating Officer Sheryl Sandberg's belief that people would discern lies from truth

Kang: Up until the end of Trump's presidency, Mark Zuckerberg and Sheryl Sandberg were still defending the idea that Trump — and really political leaders all over the world — could and should say things on the platform as they wished, and people could and should respond as they wished. They failed to see what their own employees were telling us. ... One of the most fascinating things was talking to employees within Facebook who are raising the alarm again and again and again and saying, "This is a problem. We are spreading misinformation. We are letting the president spread misinformation and it's being amplified by our own algorithms. Our systems aren't working the way we predicted and we should do something." And yet, you know, Mark Zuckerberg and Sheryl Sandberg stay the course.

On the difficulty of moderating hate speech

Frenkel: We have to remember that the scale of Facebook is nearly 3 billion users around the world. The amount of content that courses through the platform every day is just so enormous. Facebook has put in AI, artificial intelligence, as well as hired thousands of content moderators to try to detect this. But they're really far behind, and they've only really started taking this seriously since public scrutiny has shed a spotlight on the company and there is demand for change within the company. So our reporting shows, and [we're hearing] from the people inside, that they really do feel like they are racing to catch up.

Kang: And I would just add that a lot of this hate speech is happening in private groups. This is something Facebook launched just a few years ago, this push towards privacy, this push towards private groups. The idea being is that people wanted to be in small, intimate groups with like-minded people. But what happens when those small, intimate groups are QAnon or when they're malicious? Everyone is like-minded, and so no one is reporting the content. In some cases, it's not a matter of Facebook's algorithms not finding things. It's a matter of Facebook creating these kind of secluded, private, walled gardens where this kind of talk can happen, where hate speech can happen and it's not being found.

Sam Briger and Thea Chaloner produced and edited the audio of this interview. Bridget Bentz and Molly Seavy-Nesper adapted it for the web.

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 Fresh Air. To see more, visit Fresh Air.
