
Stopping the spread: Professor Adam Berinsky discusses political rumors

September 27, 2024

Isa Cruz
FIGHTING FALSEHOODS: Adam Berinsky talks about the increasing impact of misinformation on politics and how it can be countered. This event was the second in a series of lectures organized by the Office of Inclusion and Diversity.

Politically-minded members of the Bowdoin and greater Brunswick community braved dark skies and pouring rain on Thursday afternoon to hear Adam Berinsky, a professor of political science at the Massachusetts Institute of Technology, give a talk in Kresge Auditorium on a similarly gloomy topic: the political misinformation crisis that has become widespread over the last decade.

Berinsky’s lecture, “Why We Accept Misinformation and How to Fight It,” was the second in a series hosted by the Office of Inclusion and Diversity, which aims to build awareness of the current social and political context ahead of the upcoming presidential election. The previous lecture featured Pulitzer Prize-winning historian Eric Foner, who spoke about the lasting impacts of Reconstruction. Three more events are scheduled for the coming weeks.

After an introduction from Senior Vice President for Inclusion and Diversity Benje Douglas, who mentioned Berinsky’s previous work in educating organizations like Google, Facebook and the Department of Homeland Security on the dangers of misinformation, Berinsky began his lecture by highlighting the overall lack of trust in institutions and experts that pervades present-day America.

“Individuals and groups that should have that authority aren’t necessarily given that authority,” Berinsky said. “We’re in a time when people can construct their own reality, when people can think about their own facts and come to their own decisions.”

Berinsky spent the majority of the talk addressing two main concerns: why and how misinformation has become so widespread and what potential strategies exist for limiting its impacts on public opinion.

“What my research is trying to get at is … in a world in which we don’t have authority, how can we stop or slow the spread of misinformation? How can we deal with the fact that the information environment is saturated with things that are not supported by the facts?” Berinsky said.

Although he had already been studying public opinion for over a decade, Berinsky mentioned that his interest in political rumors began in 2009 with the rising popularity of rumors such as the debunked “birther” allegations surrounding President Obama’s citizenship.

Berinsky said his early research showed that attempts to fact-check rumors, such as President Obama releasing his birth certificate, often did little to shift public opinion in the long run.

“Even if we counter this rumor with a fact, even if you produce some evidence to support a claim, there’s still a lot of people who question [its validity],” Berinsky said. “So as a political scientist, I was interested in trying to think about … how can we understand these beliefs, and what can we do to change them?”

A series of experiments that Berinsky conducted showed that, although it was difficult to change the minds of people who believed a political rumor, people who were already unsure about their misinformed belief were much more open to changing their minds.

“If we focus on just the believers, we’re going to get really depressed. It’s really hard to move those people, but there are some people we can move—those uncertain folks,” Berinsky said. “There is a group of people that we can focus our efforts on.”

In the second half of the lecture, Berinsky focused on finding potential strategies to stop the spread of misinformation—which, he admitted, proved much more difficult than merely identifying why people believe political misinformation.

While fact-checkers or experts in a given field may be important in identifying false information, Berinsky mentioned that they might not always be the best tool for stopping its spread, especially among communities who already have a distrust of authority figures.

“I think it’s really important to think … not just about the credibility [that someone] brings to the table, but finding the right messenger,” Berinsky said.

For example, Berinsky found that using right-wing politicians to debunk misinformation largely believed by Republicans was the most effective approach. Similarly, left-wing politicians were best at speaking to Democrats.

Other strategies that Berinsky mentioned included offering media literacy tips, “prebunking” rumors before they catch on and, of course, fact-checking. But after running a number of experiments, Berinsky concluded that no single tactic was completely effective by itself.

“What can we do in a world where we can affect things only marginally? Do we give up?” Berinsky said. “There’s no magic solution…. We don’t have a single best solution, so we really need to think about how we can bundle these together.”

Alexander Nicholas ’28, who attended the talk to better inform himself ahead of the upcoming elections, said that he walked away with a stronger sense of what goes into people’s political beliefs.

“It [was interesting] when he broke it down into some psychological principles,” Nicholas said. “That helped me get a general sense of how Americans are thinking.”
