Livestreams Undermine Social Media Platforms' Efforts To Fight Misinformation

ARI SHAPIRO, HOST:

Tech platforms took their most aggressive stance against election misinformation last week, flagging tweets and putting warning labels on questionable allegations, and yet false claims and conspiracy theories still reached a lot of people. One reason - live video. NPR tech correspondent Shannon Bond joins us now to explain.

Hi, Shannon.

SHANNON BOND, BYLINE: Hey, Ari.

SHAPIRO: Start off by telling us about some of these live videos that have popped up since last week. Like, what do you see in them?

BOND: Well, how much time do you have? So let's start with Election Day. There were fake results that were livestreamed on YouTube, which is the video platform owned by Google. And one of these livestreams actually reached the top five on Google's results for vote tallies in swing states. And, you know, that meant that hundreds of thousands of viewers saw some of these streams before YouTube eventually took them down.

We also are seeing live commentary from influencers, sort of like what you would see on cable news, right? And so these influencers will push these false narratives about voter fraud. They'll show tweets. They'll play other video clips. And they'll often frame it as, well, I'm just asking questions, or, people on the Internet are saying this.

Another tactic we see is what is effectively fake live video. So people post video from something that happened last year or take a video from a protest in one city and claim it's live from another city, and they'll use trending hashtags on those videos to reach a lot of people.

SHAPIRO: I think we're all familiar now with how social media companies crack down on text and photos - content that stays up for a while and can be pulled down. How do you do that with live video? I mean, it seems like a unique challenge.

BOND: Yeah. I mean, we did see these companies move really quickly last week, you know, labeling posts from President Trump and his supporters making false claims. But live video is harder to stop the spread of. For one thing, it's live, and you can't scan it as quickly as text. Here's what Evelyn Douek told me. She's at Harvard Law School. She studies the platforms' rules.

EVELYN DOUEK: It's harder to search video content as opposed to text. And so it's a lot harder to scrutinize what's going on, and it's a lot more time consuming.

BOND: And so these videos, you know, they may eventually get taken down by the platforms or a label is put on saying they're misleading, but in the meantime, they reach lots and lots of viewers.

SHAPIRO: What do the companies say they're doing about this?

BOND: Well, I should say, first of all, that Google, Facebook and TikTok are all among NPR's financial supporters. Now, Facebook and YouTube both say they are taking steps to limit the reach of misleading claims made on live video, you know, by limiting distribution or not recommending them. But they're still slow at this.

I mean, take last week - former White House chief strategist Steve Bannon went live on Facebook and YouTube, and he called for Dr. Anthony Fauci and FBI Director Christopher Wray to be beheaded. YouTube and Facebook both deleted the video, but only hours after it had gone live, and by then lots and lots of people had seen it. The companies also penalized Bannon's account - he can't post new posts or videos. But that was a less aggressive stance than Twitter's, which permanently suspended Bannon's account.

You know - and, Ari, another tricky challenge for these platforms is that a video doesn't just stay on one platform. So in many cases, somebody will take a snippet of a live feed from YouTube - right? - with, you know, baseless rumors, these sorts of claims, and they'll put that up on TikTok. Maybe TikTok takes it down because it also doesn't allow some of these misleading claims. But those things get reposted, and it's just really hard to stamp out.

It's so hard for any individual company to take really effective action kind of against the proliferation of these false claims because even after they've been debunked, they pop up again and again.

SHAPIRO: That's NPR correspondent Shannon Bond.

Thank you, Shannon.

BOND: Thank you.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.
