Pelosi’s Manipulated Video: Why We Prefer Fiction to Fact
Last week it was hard to avoid the video of Nancy Pelosi that had been slowed down to make her appear drunk, then tweeted by presidential attorney Rudy Giuliani and shared widely on Fox News and social media. Faced with an expert consensus that the video had been manipulated, YouTube took it down. Twitter, which declined to comment on its inaction, did not, and neither did Facebook. And while much of the important debate has focused on whether this video should be removed from social media sites, not enough attention has been paid to how and why we are so willing to embrace lies rather than facts.
In an interview with Anderson Cooper, Monika Bickert, Facebook’s vice president for product policy and counterterrorism, tried to explain why the company didn’t take down the video. She pointed out that Facebook works with more than 50 fact-checking organizations around the world. She argued that when people advocate violence or make other direct threats to people’s safety, those posts get taken down. She claimed that Facebook had removed more than 3 billion fake accounts since January and (in a political nuance whose significance only she seemed to grasp) said that if the fake video had been posted by a fake account rather than a real person, it would have been removed. But when Cooper pressed her to explain why Facebook would remove the fake accounts but not the fake videos, her response was disingenuous at best.
“We’re drastically reducing the visibility of this content and letting people know it’s fake so they can make an informed choice,” she (falsely) claimed. In fact, Facebook users who try to share the video now see a notice telling them that “there are additional reports available on this” from PolitiFact, 20 Minutes, Factcheck.org, Lead Stories and the Associated Press.
Is that better than nothing? Sure. But are most people, let alone the people most likely to think Nancy Pelosi is incompetent, drunk or demented, really going to visit one of these sites looking for those “additional reports”? Of course not.
But what’s more troubling, and certainly not unique to Facebook, is the worldview represented when she said, “We think it’s important for people to make their own informed choice about what to believe…” In essence, she is saying: We will post videos that we know were manipulated to mislead, and we will let viewers choose whether to believe the truth or the lie.
The problem with this approach is that, as historian and “Sapiens” author Yuval Noah Harari argues in a recent New York Times editorial, humans are particularly predisposed to creating and believing lies. In fact, it can be socially useful, as adherence to shared mythos enables large-scale cooperation.
“… [L]arge-scale cooperation depends on believing in common stories,” he explains. “But these stories don’t have to be true. You can unite millions of people by making them believe completely fictional stories about God, about race, or about the economy.” Or about Nancy Pelosi.
Indeed, Harari argues that fiction tends to be more powerful than truth in uniting subgroups of people. And that is, after all, the overarching goal of most political communication.
If the purpose of a story is to help one tribe distinguish itself from another, a made-up story will be more differentiating than a truth. For example, we can all agree that Nancy Pelosi is two heartbeats away from the presidency; but the group that claims she cannot “put a subject before a predicate in the same sentence” (Fox) differentiates both itself and its followers.
And, as Donald Trump learned from Infowars’ Alex Jones, if your goal is to use stories both to increase power and to generate displays of loyalty, the more outrageous the story, the better. Almost everyone could agree that Hillary Clinton had connections in high places. But only the hardest core of the Trump and Breitbart base would publicly buy into the idea that she was running a sex trafficking ring in the basement of a pizzeria. And once they have done so, they’re unlikely to back down. Instead, they will redouble their loyalty.
And therein lies the danger that massive news-distribution systems like Twitter and Facebook are spreading fiction alongside fact. In doing so, they perpetuate the idea that truth is whatever we choose to believe.
But pointing fingers at the distributors of lies will not lessen the dangers of credulity. We must also look within. As human beings, we are (to quote the behavioral economist Dan Ariely) “predictably irrational.” We are driven by cognitive biases that cause us to more easily recall ideas, experiences, and beliefs that match our own, and to favor those we perceive to be most like us. Our logic is often overtaken by our thirst for consistency and validation. And all of these emotional needs and cognitive biases often combine to work against our own interests.
We may not be able to modify the policies of Facebook or Twitter. But if we are to make sound judgments and act for our individual and collective good, we must learn to live with ambiguity, with the cognitive dissonance evoked when reality does not conform to our beliefs. And then we have to act on what is real.