If you were horrified by how easy it was for someone to create a fake viral video that made Nancy Pelosi look as if she were slurring her words on stage, you weren't alone. Facebook is now taking action to stop the misleading video, which has over 2.5 million views, from spreading further, though it won't go as far as deleting it entirely.
On Wednesday, a video that appeared to show the Democratic congresswoman “drunk” went viral, first on Facebook and then on other social media platforms. But Pelosi wasn’t actually impaired while speaking at a Center for American Progress event. Instead, the creator (whose identity is not known) simply slowed the footage of the Speaker of the House to 75 percent of its original speed and altered the pitch.
That was it. No A.I., no complicated algorithm, no scary “deepfakes.” Simple video editing was all someone needed to get millions of views and inflame the anti-Pelosi base in the process.
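Just how simple is this kind of edit? Here's a rough sketch of the arithmetic involved (an illustration, not the unknown creator's actual tool): slowing playback to 75 percent stretches every frame timestamp by 1/0.75, and a naive slowdown also drops the audio pitch by 12·log2(0.75), roughly five semitones, which is presumably why the creator then adjusted the pitch to keep the voice sounding natural.

```python
import math

def retime(timestamps, speed):
    """Stretch frame timestamps for slowed playback.

    speed=0.75 plays the clip at 75 percent of its original rate,
    so every timestamp moves 1/0.75 (about 1.33x) later.
    """
    return [t / speed for t in timestamps]

def naive_pitch_shift_semitones(speed):
    """Pitch change from slowing audio by plain resampling.

    Resampling at the same factor lowers pitch by 12 * log2(speed)
    semitones, a telltale drop that a pitch correction can mask.
    """
    return 12 * math.log2(speed)

# A 2-second clip with frames at 1-second intervals, slowed to 75 percent:
print(retime([0.0, 1.0, 2.0], 0.75))      # timestamps stretch to ~1.33x
print(naive_pitch_shift_semitones(0.75))  # about -4.98 semitones
```

The point of the sketch is the scale of the effort: two one-line calculations, available in any consumer video editor, are the entire "technology" behind the fake.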
Now, Facebook is dealing with the video within the framework it has set up for assessing and mitigating fake content. Facebook confirmed to Mashable that it is reducing the video’s presence in News Feed, since a Facebook fact-checking partner has flagged it as “false.” The video will still exist on the platform, but the News Feed algorithm won’t pick it up and place it in people’s feeds.
Facebook explained that it isn’t removing the video because the video doesn’t violate its community standards. By Facebook’s rules, the information you post doesn’t have to be true. But Facebook says it is continually working to improve the integrity of the platform by mitigating the viral spread of misleading content and adding context to flagged content.
That context, though, is hard to find in this case.
Facebook said it would show related articles to point out the content is untrue, but none of the content in a “related videos” sidebar does so. Only when a user presses “share” does a box warning about the veracity of the video appear.
[Image: The “Related Videos” sidebar shows no additional context; Facebook presents the fake video without comment. Screenshot: Jack Morse/Mashable]
[Image: Why not just call it “fake,” tho? Screenshot: Rachel Kraus/Mashable]
Facebook’s stance here is that even if a person has a right to post a false piece of content, that doesn’t mean it’s Facebook’s duty to promote that content by giving it wide distribution via News Feed.
Unfortunately, the damage has already been done. As of yesterday evening, the video had been viewed over 2 million times. Just 12 hours later, that count is up to 2.5 million, and there are multiple clones on Facebook and other platforms. Facebook will treat duplicate videos the same way, when it finds them. Once again, Facebook is playing catch-up with bad actors on its platform, relying on systems that catch harmful content rather than prevent it, after it has already multiplied and gone viral.
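Facebook hasn't said how it hunts down re-uploads. One standard industry technique (an assumption here, not Facebook's confirmed method) is perceptual hashing: shrink a sampled frame to a tiny grayscale grid, hash the brightness pattern, and treat uploads whose hashes sit within a small Hamming distance as duplicates, even after re-encoding wobbles the pixels. A minimal difference-hash ("dHash") sketch:

```python
def dhash_bits(gray_rows):
    """Difference hash of one grayscale frame.

    gray_rows: 2-D list of brightness values (e.g. 8 rows x 9 columns,
    produced by shrinking a frame). Each bit records whether a pixel is
    darker than its right-hand neighbor.
    """
    return [1 if left < right else 0
            for row in gray_rows
            for left, right in zip(row, row[1:])]

def hamming(bits_a, bits_b):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(bits_a, bits_b))

# A tiny 3x4 "frame" and a lightly re-encoded copy of it:
frame = [[10, 20, 30, 25],
         [40, 35, 50, 45],
         [ 5, 60, 55, 70]]
copy  = [[11, 19, 31, 24],   # small brightness wobble from re-encoding
         [41, 36, 49, 46],
         [ 6, 59, 56, 71]]

print(hamming(dhash_bits(frame), dhash_bits(copy)))  # 0: a match
```

Even with a scheme like this, the catch is the one the article describes: the hash comparison only fires after a clone has been uploaded, which is detection, not prevention.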
Facebook recently shared that it had deleted over 2 billion fake accounts this year. It is under siege by bad actors and scammers trying to manipulate the platform. And though Facebook’s reasoning about down-ranking, not deleting, makes sense, that might not be a heavy enough hammer to stop content manipulators, and their simple video-editing software, in their tracks.