Can you tell which of the following statements are true and which are false?
COVID-19 is not a threat to younger people, and only those who have other medical conditions are dying from it.
The mRNA vaccines developed to prevent the coronavirus alter your genes, can make your body “magnetic,” and are killing more people than the virus itself.
President Joe Biden’s climate change plan calls for a ban on meat consumption to cut greenhouse gas emissions.
The 2020 presidential election was rigged and stolen.
If you guessed that all of these claims are false, you’re right ― take a bow. Not a single one of these statements has any factual support, according to scientific research, legal rulings, and legitimate government authorities.
And yet public opinion surveys show millions of Americans, and others around the world, believe some of these falsehoods are true and can’t be convinced otherwise.
Social media, politicians, partisan websites, TV programs, and commentators have circulated these and other unfounded claims so widely and so often that many people say they simply can't tell anymore what is objectively true.
So much so that the authors of a fascinating new study have concluded we are living in a "post-truth era," in which baseless beliefs and subjective opinions are given higher priority than verifiable facts.
The new study ― The Rise and Fall of Rationality in Language, published in the Proceedings of the National Academy of Sciences ― found that facts have become less important in public discourse.
As a result, unsupported beliefs have taken precedence over readily identifiable truths in discussions of health, science, and politics. The upshot: "Feelings trump facts" in social media, news reports, books, and other sources of information.
And here’s the kicker: The trend did not begin with the rise of former President Donald Trump, the COVID-19 pandemic, or the advent of social media; in fact, it has been growing for much longer than you might think.
“While the current ‘post-truth era’ has taken many by surprise, the study shows that over the past 40 years, public interest has undergone an accelerating shift from the collective to the individual, and from rationality towards emotion,” concluded the researchers from Indiana University and Wageningen University & Research (WUR) in the Netherlands.
“Our work suggests that the societal balance between emotion and reason has shifted back to what it used to be around 150 years ago,” says lead researcher Marten Scheffer, PhD, a professor in the Department of Environmental Sciences at WUR. “This implies that scientists, experts, and policymakers will have to think about the best way to respond to that social change.”
Researchers Surprised by Findings
The findings are based on a detailed analysis of language from millions of books, newspaper articles, Google searches, TV reports, social media posts, and other sources dating back to 1850.
The researchers analyzed how often the 5,000 most used words appeared over the past 170 years and found that the use of those having to do with facts and reasoning, such as "determine" and "conclusion," has fallen dramatically since 1980. Meanwhile, the use of words related to human emotion, such as "feel" and "believe," has skyrocketed.
Scheffer notes rapid developments in science and technology from 1850 to 1980 had profound social and economic benefits that helped boost the status of the scientific approach. That shift in public attitudes had ripple effects on culture, society, education, politics, and religion ― and “the role of spiritualism dwindled” in the modern world, he says.
But since 1980, that trend has seen a major reversal, with beliefs becoming more important than facts to many people, he says. At the same time, trust in science and scientists has fallen.
Scheffer says the researchers expected to find some evidence of a swing toward more belief-based sentiments during the Trump era but were surprised to discover how strong it is and that the trend has actually been a long time coming.
“The shift in interest from rational to intuitive/emotional is pretty obvious now in the post-truth political and social media discussion,” he says. “However, our work shows that it already started in the 1980s. For me personally, that went under the radar, except perhaps for the rise of alternative (to religion) forms of spirituality.
“We were especially struck by how strong the patterns are and how universal they appear across languages, nonfiction and fiction, and even in The New York Times.”
In the political world, the implications are significant, affecting policies and politicians on both sides of the aisle and across the globe. Just look at the deepening political divisions during the Trump presidency.
But for health and science, the spread of misinformation and falsehoods can be matters of life or death, as we have seen in the politically charged debates over how best to combat COVID-19 and global climate change.
“Our public debate seems increasingly driven by what people want to be true rather than what is actually true. As a scientist, that worries me,” says study co-author Johan Bollen, PhD, a professor of informatics at Indiana University.
“As a society, we are now faced with major collective problems that we need to approach from a pragmatic, rational, and objective perspective to be successful,” he says. “After all, global warming doesn’t care about whether you believe in it or not … but we will all suffer as a society if we fail to take adequate measures.”
For WUR co-researcher Ingrid van de Leemput, the trend isn’t merely academic; she’s seen it play out in her personal life.
“I do speak to people that, for instance, think the vaccines are poison,” she says. “I’m also on Twitter, and there, I’m every day surprised about how easily many people form their opinions, based on feelings, on what others say, or on some unfounded source.”
Public health experts say the embrace of personal beliefs over facts is one reason only 63% of Americans have been vaccinated against COVID-19. The result: millions of preventable infections among those who downplay the risks of the virus and reject the strong scientific evidence of vaccine safety and effectiveness.
"None of this really surprises me," Johns Hopkins University social and behavioral scientist Rupali Limaye, PhD, says of the new study findings. Limaye co-authored a 2016 paper in JAMA Pediatrics about how to talk to parents about vaccine hesitancy in what the authors called "this post-truth era."
Limaye says the trend has made it difficult for doctors, scientists, and health authorities to make fact-based arguments for COVID-19 vaccination, mask-wearing, social distancing, and other measures to control the virus.
“It’s been really hard being a scientist to hear people say, ‘Well, that’s not true’ when we say something very basic that I think all of us can agree on ― like the grass is green,” she says. “To be honest, I worry that a lot of scientists are going to quit being in science because they’re exhausted.”
What’s Driving the Trend?
So, what’s behind the embrace of “alternative facts,” as former White House counselor Kellyanne Conway put it so brazenly in 2017, in defending the White House’s false claims that Trump’s inauguration crowd was the largest ever?
Scheffer and colleagues identified a handful of things that have encouraged the embrace of falsehoods over facts in recent years.
The internet: Its rise in the late 1980s, and its growing role as a primary source of news and information, has allowed more belief-based misinformation to flourish and spread like wildfire.
Social media: The new study found the use of sentiment- and intuition-related words accelerated around 2007, along with a global surge in social media that catapulted Facebook, Twitter, and others into the mainstream, replacing more traditional fact-based media such as newspapers and magazines.
The 2007 financial crisis: The downturn in the global economy meant more people were dealing with job stress, investment losses, and other problems that fed interest in belief-based, anti-establishment social media posts.
Conspiracy theories: Falsehoods involving hidden political agendas, shadow “elites,” and wealthy people with dark motives tend to thrive during times of crisis and societal anxiety. “Conspiracy theories originate particularly in times of uncertainty and crisis and generally depict established institutions as hiding the truth and sustaining an unfair situation,” the researchers noted. “As a result, they may find fertile grounds on social media platforms promulgating a sense of unfairness, subsequently feeding anti-system sentiments.”
Scheffer says that growing political divisions during the Trump era have widened the fact-vs.-fiction divide. The ex-president voiced many anti-science views on global climate change, for instance, and spread so many falsehoods about COVID-19 and the 2020 election that Facebook, Twitter, and YouTube suspended his accounts.
Yet Trump remains a popular figure among Republicans: in a December poll by the University of Massachusetts Amherst, most said they believe his baseless claims that the 2020 election was "rigged" and "stolen," despite abundant, easily accessible evidence that it was secure.
More than 60 courts have rejected Trump’s lawsuits seeking to overturn the election results. All 50 states, the District of Columbia, and both branches of Congress have certified the election results, giving Biden the White House. Even Trump’s own Justice Department confirmed that the 2020 election was free and fair.
Nevertheless, the University of Massachusetts survey found that most Republicans believe one or more conspiracy theories floated by the former president and those pushing his “big lie” that Democrats rigged the election to elect Biden.
Ed Berliner, an Emmy Award-winning broadcast journalist and media consultant, suggests something else is driving the spread of misinformation: the pursuit of ratings by cable TV and media companies to boost ad and subscriber revenues.
As a former executive producer and syndicated cable TV show host, he says he has seen firsthand how facts are often lost in opinion-driven news programs, even on network programs claiming to offer “fair and balanced” journalism.
“Propaganda is the new currency in America, and those who do not fight back against it are doomed to be overrun by the misinformation,” says Berliner, host of The Man in the Arena and CEO of Entourage Media LLC.
“The broadcast news media has to stop this incessant ‘infotainment’ prattle, stop trying to nuzzle up to a soft side, and bear down on hard facts, exposing the lies and refusing to back down.”
Public Health Implications
Public health and media experts alike say the PNAS study findings are disheartening but underscore the need for doctors and scientists to do a better job of communicating about COVID-19 and other pressing issues.
Limaye, from Johns Hopkins, is particularly concerned about the rise in conspiracy theories that has led to COVID-19 vaccine hesitancy.
“When we speak to individuals about getting the COVID vaccine … the types of concerns that come up now are very different than they were 8 years ago,” she says. “The comments we used to hear were much more related to vaccine safety. [People] would say, ‘I’m worried about an ingredient in the vaccine’ or ‘I’m worried that my kiddo has to get three different shots within 6 months to have a series dose completed.'”
But now, a lot of comments they receive are about government and pharma conspiracies.
What that means is doctors and scientists must do more than simply say "here are the facts" and "trust me, I'm a doctor or a scientist," she says. And these challenges extend beyond public health.
“It’s funny, because when we talk to climate change scientists, as vaccine [specialists], we’ll say we can’t believe that people think COVID is a hoax,” she says. “And they’re like, ‘Hold my beer, we’ve been dealing with this for 20 years. Hello, it’s just your guys’ turn to deal with this public denial of science.'”
Limaye is also concerned about the impacts on funding for scientific research.
“There’s always been a really strong bipartisan effort with regards to funding for science, when you look at Congress and when you look at appropriations,” she says. “But what ended up happening, especially with the Trump administration, was that there was a real shift in that. We’ve never really seen that before in past generations.”
So, what’s the big take-home message?
Limaye believes doctors and public health experts must show more empathy ― and not be combative or arrogant ― in communicating science in one-on-one conversations. This month, she’s launching a new course for parents, school administrators, and nurses on how to do precisely that.
“It’s really all about how to have hard conversations with people who might be anti-science,” she says. “It’s being empathetic and not being dismissive. But it’s hard work, and I think a lot of people are just not cut out for it and just don’t have the time for it. … You can’t just say, ‘Well, this is science, and I’m a doctor’ ― that doesn’t work anymore.”
Brendan Nyhan, PhD, a Dartmouth College political scientist, echoes those sentiments in a separate paper recently published in the Proceedings of the National Academy of Sciences. In fact, he notes that simply providing accurate, fact-based information to counter false claims often fails to dislodge some people's unfounded beliefs.
“One response to the prevalence of mistaken beliefs is to try to set the record straight by providing accurate information ― for instance, by providing evidence of the scientific consensus on climate change,” he writes. “The failures of this approach, which is sometimes referred to as the ‘deficit model’ in science communication, are well-known.”
Nyhan argues two things make some people more prone to believe falsehoods:
What scientists call “ingrouping,” a kind of tribal mentality that makes some people choose social identity or politics over truth-seeking and demonize others who don’t agree with their views
The rise of high-profile political figures, such as Trump, who encourage their followers to indulge in their desire for "identity-affirming misinformation"
Scheffer, from Wageningen University & Research, says the most important thing for doctors, health experts, and scientists to recognize is that it’s crucial to gain the trust of someone who may believe fictions over facts to make any persuasive argument on COVID-19 or any other issue.
He also has a standard response to those who present falsehoods to him as facts that he suggests anyone can use: “That is interesting. Would you mind helping me understand how you came to that opinion?”
Sources

Marten Scheffer, PhD, professor, Wageningen University & Research, the Netherlands.
Ingrid van de Leemput, assistant professor, Wageningen University & Research, the Netherlands.
Johan Bollen, PhD, professor, Indiana University.
Rupali Limaye, PhD, social and behavioral scientist, Johns Hopkins University.
Ed Berliner, broadcast journalist, media consultant; host, The Man in the Arena; CEO, Entourage Media LLC.
Proceedings of the National Academy of Sciences: “The Rise and Fall of Rationality in Language,” “Why the backfire effect does not explain the durability of political misperceptions.”
Eurekalert.org: “‘We conclude’ or ‘I believe’? Rationality declined decades ago.”
NBC News: "Conway: Press Secretary Gave 'Alternative Facts.'"
The Associated Press: “Disputing Trump, Barr says no widespread election fraud.”
The New York Times: “‘Belonging Is Stronger Than Facts’: The Age of Misinformation,” “See How Vaccinations Are Going in Your County and State,” “The Problem With ‘Self-Investigation’ in a Post-Truth Era,” “Chris Hayes Reviews Michiko Kakutani’s Book About Our Post-Truth Era.”
The Washington Post: “Biden’s climate plan doesn’t ban meat. But baseless claims left Republicans fuming: ‘Stay out of my kitchen.'”
Cornell University: “How to spot fake news.”
Ralph Keyes: “The Post-Truth Era: Dishonesty and Deception in Contemporary Life.”
The Chronicle of Higher Education: “The Post-Truth Issue.”