| Alias | Tag | Date | DocType | Hierarchy | TimeStamp | Link | location | CollapseMetaTable |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
|  |  | 2024-04-28 | WebClipping |  | 2024-04-28 | https://www.newyorker.com/magazine/2024/04/22/dont-believe-what-theyre-telling-you-about-misinformation |  | true |
Parent:: @News
Read:: 🟥
```button
name Save
type command
action Save current file
id Save
```
^button-TheFakeFakeNewsProblemandtheTruthAboutMisinformationNSave
The Fake Fake-News Problem and the Truth About Misinformation
Millions of people have watched Mike Hughes die. It happened on February 22, 2020, not far from Highway 247 near the Mojave Desert city of Barstow, California. A homemade rocket ship with Hughes strapped in it took off from a launching pad mounted on a truck. A trail of steam billowed behind the rocket as it swerved and then shot upward, a detached parachute unfurling ominously in its wake. In a video recorded by the journalist Justin Chapman, Hughes disappears into the sky, a dark pinpoint in a vast, uncaring blueness. But then the rocket reappears and hurtles toward the ground, crashing, after ten long seconds, in a dusty cloud half a mile away.
Hughes was among the best-known proponents of Flat Earth theory, which insists that our planet is not spherical but a Frisbee-like disk. He had built and flown in two rockets before, one in 2014 and another in 2018, and he planned to construct a “rockoon,” a combination rocket and balloon, that would carry him above the upper atmosphere, where he could see the Earth’s flatness for himself. The 2020 takeoff, staged for the Science Channel series “Homemade Astronauts,” was supposed to take him a mile up—not high enough to see the Earth’s curvature, but high-profile enough to garner more funding and attention.
Flat Earth theory may sound like one of those deliberately far-fetched satires, akin to Birds Aren’t Real, but it has become a magnet for anti-scientific conspiracists, growing entangled with movements such as QAnon and COVID-19 skepticism. In “Off the Edge: Flat Earthers, Conspiracy Culture, and Why People Will Believe Anything” (Algonquin), the former Daily Beast reporter Kelly Weill writes that the tragedy awakened her to the sincerity of Flat Earthers’ convictions. After investigating the Flat Earth scene and following Hughes, she had figured that, “on some subconscious level,” Hughes knew the Earth wasn’t flat. His death set her straight: “I was wrong. Flat Earthers are as serious as your life.”
Weill isn’t the only one to fear the effects of false information. In January, the World Economic Forum released a report showing that fourteen hundred and ninety international experts rated “misinformation and disinformation” the leading global risk of the next two years, surpassing war, migration, and climatic catastrophe. A stack of new books echoes their concerns. In “Falsehoods Fly: Why Misinformation Spreads and How to Stop It” (Columbia), Paul Thagard, a philosopher at the University of Waterloo, writes that “misinformation is threatening medicine, science, politics, social justice, and international relations, affecting problems such as vaccine hesitancy, climate change denial, conspiracy theories, claims of racial inferiority, and the Russian invasion of Ukraine.” In “Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity” (Norton), Sander van der Linden, a social-psychology professor at Cambridge, warns that “viruses of the mind” disseminated by false tweets and misleading headlines pose “serious threats to the integrity of elections and democracies worldwide.” Or, as the M.I.T. political scientist Adam J. Berinsky puts it in “Political Rumors: Why We Accept Misinformation and How to Fight It” (Princeton), “a democracy where falsehoods run rampant can only result in dysfunction.”
Most Americans seem to agree with these theorists of human credulity. Following the 2020 Presidential race, sixty per cent thought that misinformation had a major impact on the outcome, and, to judge from a recent survey, even more believe that artificial intelligence will exacerbate the problem in this year’s contest. The Trump and the DeSantis campaigns both used deepfakes to sully their rivals. Although they justified the fabrications as transparent parodies, some experts anticipate a “tsunami of misinformation,” in the words of Oren Etzioni, a professor emeritus at the University of Washington and the first C.E.O. of the Allen Institute for Artificial Intelligence. “The ingredients are there, and I am completely terrified,” he told the Associated Press.
The fear of misinformation hinges on assumptions about human suggestibility. “Misinformation, conspiracy theories, and other dangerous ideas latch on to the brain and insert themselves deep into our consciousness,” van der Linden writes in “Foolproof.” “They infiltrate our thoughts, feelings, and even our memories.” Thagard puts it more plainly: “People have a natural tendency to believe what they hear or read, which amounts to gullibility.”
But do the credulity theorists have the right account of what’s going on? People like Mike Hughes aren’t gullible in the sense that they’ll believe anything; they reject the scientific consensus, after all. Partisans of other well-known conspiracy theories (the government is run by lizard people; a cabal of high-level pedophilic Democrats operates out of a neighborhood pizza parlor) are impervious to the assurances of the mainstream media. Have we been misinformed about the power of misinformation?
In 2006, more than five hundred skeptics met at an Embassy Suites hotel near O’Hare Airport, in Chicago, to discuss a conspiracy. They listened to presentations on mass hypnosis, the melting point of steel, and how to survive the collapse of the existing world order. They called themselves many things, including “truth activists” and “9/11 skeptics,” although the name that would stick, and which observers would use for years afterward, was Truthers.
The Truthers held that the attacks on the Pentagon and the World Trade Center were masterminded by the White House to expand government power and enable military and security industries to profit from the war on terror. According to an explanation posted by 911truth.org, a group that helped sponsor the conference, George W. Bush and his allies gagged and intimidated whistle-blowers, mailed anthrax to opponents in the Senate, and knowingly poisoned the inhabitants of lower Manhattan. On that basis, Truthers concluded, “the administration does consider the lives of American citizens to be expendable on behalf of certain interests.”
Cartoon by Frank Cotham: “Out of this dispute, a clear leader will emerge.”
The Truthers, in short, maintained that the government had gone to extreme measures, including killing thousands of its own citizens, in order to carry out and cover up a conspiracy. And yet the same Truthers advertised the conference online and met in a place where they could easily be surveilled. Speakers’ names were posted on the Internet along with videos, photographs, and short bios. The organizers created a publicly accessible forum to discuss next steps, and a couple of attendees spoke to a reporter from the Times, despite the mainstream media’s ostensible complicity in the coverup. By the logic of their own theories, the Truthers were setting themselves up for assassination.
Their behavior demonstrates a paradox of belief. Action is supposed to follow belief, and yet beliefs, even fervently espoused ones, sometimes exist in their own cognitive cage, with little influence over behavior. Take the “Pizzagate” story, according to which Hillary Clinton and her allies ran a child-sex ring from the basement of a D.C. pizzeria. In the months surrounding the 2016 Presidential election, a staggering number of Americans—millions, by some estimates—endorsed the account, and, in December of that year, a North Carolina man charged into the restaurant, carrying an assault rifle. Van der Linden and Berinsky both use the incident as evidence of misinformation’s violent implications. But they’re missing the point: what’s really striking is how anomalous that act was. The pizzeria received menacing phone calls, even death threats, but the most common response from believers, aside from liking posts, seems to have been leaving negative Yelp reviews.
That certain deeply held beliefs seem insulated from other inferences isn’t peculiar to conspiracy theorists; it’s the experience of regular churchgoers. Catholics maintain that the Sacrament is the body of Christ, yet no one expects the bread to taste like raw flesh or accuses fellow-parishioners of cannibalism. In “How God Becomes Real” (2020), the Stanford anthropologist T. M. Luhrmann recounts evangelical Christians’ frustrations with their own beliefs. They thought less about God when they were not in church. They confessed to not praying. “I remember a man weeping in front of a church over not having sufficient faith that God would replace the job he had lost,” Luhrmann writes. The paradox of belief is one of Christianity’s “clearest” messages, she observes: “You may think you believe in God, but really you don’t. You don’t take God seriously enough. You don’t act as if he’s there.” It’s right out of Mark 9:24: “Lord, I believe; help my unbelief!”
The paradox of belief has been the subject of scholarly investigation; puzzling it out promises new insights about the human psyche. Some of the most influential work has been by the French philosopher and cognitive scientist Dan Sperber. Born into a Jewish family in France in 1942, during the Nazi Occupation, Sperber was smuggled to Switzerland when he was three months old. His parents returned to France three years later, and raised him as an atheist while imparting a respect for all religious-minded people, including his Hasidic Jewish ancestors.
The exercise of finding rationality in the seemingly irrational became an academic focus for Sperber in the nineteen-seventies. Staying with the Dorze people in southern Ethiopia, he noticed that they made assertions that they seemed both to believe and not to believe. People told him, for example, that “the leopard is a Christian animal who observes the fasts of the Ethiopian Orthodox Church.” Nevertheless, the average Dorze man guarded his livestock on fast days just as much as on other days. “Not because he suspects some leopards of being bad Christians,” Sperber wrote, “but because he takes it as true both that leopards fast and that they are always dangerous.”
Sperber concluded that there are two kinds of beliefs. The first he has called “factual” beliefs. Factual beliefs—such as the belief that chairs exist and that leopards are dangerous—guide behavior and tolerate little inconsistency; you can’t believe that leopards do and do not eat livestock. The second category he has called “symbolic” beliefs. These beliefs might feel genuine, but they’re cordoned off from action and expectation. We are, in turn, much more accepting of inconsistency when it comes to symbolic beliefs; we can believe, say, that God is all-powerful and good while allowing for the existence of evil and suffering.
In a masterly new book, “Religion as Make-Believe” (Harvard), Neil Van Leeuwen, a philosopher at Georgia State University, returns to Sperber’s ideas with notable rigor. He analyzes beliefs with a taxonomist’s care, classifying different types and identifying the properties that distinguish them. He proposes that humans represent and use factual beliefs differently from symbolic beliefs, which he terms “credences.” Factual beliefs are for modelling reality and behaving optimally within it. Because of their function in guiding action, they exhibit features like “involuntariness” (you can’t decide to adopt them) and “evidential vulnerability” (they respond to evidence). Symbolic beliefs, meanwhile, largely serve social ends, not epistemic ones, so we can hold them even in the face of contradictory evidence.
`$= dv.el('center', 'Source: ' + dv.current().Link + ', ' + dv.current().Date.toLocaleString("fr-FR"))`