Why Are We Still Falling for “Fake News”?

Eni Mustafaraj gives a talk
August 10, 2017

Since the phenomenon of so-called fake news first captured public attention, the term has evolved from describing websites that deliberately push false stories, hoaxes, and conspiracy theories to encompassing nearly any claim of a doubtful nature.

Eni Mustafaraj, assistant professor of computer science at Wellesley, along with P. Takis Metaxas, professor of computer science, has been studying online misinformation, investigating everything from Google bombs to Twitter bots and Facebook algorithms, since 2008. They’ve spoken to the media extensively about the trend, and they are also using the topic as a springboard for discussing biases, the dangers of echo chambers, and the need to cultivate and support critical thinking.

Mustafaraj recently published a piece on Medium asserting that social networks such as Twitter and Facebook have a responsibility to warn their users about potentially dubious news stories. In “The fake news story that fooled even Maggie Haberman,” she addressed an incident involving the New York Times political reporter. Haberman apparently fell for a story, published on June 30 by what was later found to be a fake news site, the Jackson Telegraph, claiming that a mass grave of dozens of tortured black men had been found on a deceased KKK leader’s estate. Haberman tweeted about the story even after the fact-checking site Snopes.com had debunked it.

In the piece, Mustafaraj wrote, “Twitter, differently from ‘post-2016 election’ Facebook, has yet to start collaborating with fact checkers to surface such useful information to its overwhelmed users.” (On August 3, Facebook announced that it would roll out a related-articles feature to fight fake news.) However, she went on to say that relying on third-party fact-checking might already be too late; Snopes’ rebuttal came two days after the story had gone viral on Facebook and Twitter. “After two days,” Mustafaraj wrote, “most people have already moved on to other stories.”

A better strategy, she said, is immediately following the story with a link to a trusted source that contradicts the false information. She referenced an experiment by Georgetown professor Leticia Bode. However, Mustafaraj pointed out that this works only when the false information has been circulating for a long time: “For example: no, vaccines don’t cause autism, or no, climate change is not a hoax.”

When a fact-check for a story does not yet exist, Mustafaraj proposed that “the platforms must surface useful signals that make it easy for us to verify the credibility of a web source before we start spreading information.” Previously, Mustafaraj had written about the idea of “nutrition fact labels,” in an article exploring additional signals that platforms could display to users alongside a story.

Mustafaraj said that she and Metaxas found in 2010 that an easy way to check an unknown website is to look up its domain registration records and examine when it was registered and by whom. She said that if readers had done this before the fake KKK story went viral, they would have found that the website, JacksonTelegraph.com, had been registered only one week before the story was published.
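The check Mustafaraj describes amounts to a WHOIS lookup of the domain’s registration date. The sketch below is not from her work; it is a minimal illustration in Python, assuming the third-party python-whois package is installed, and the domain name and 30-day threshold are chosen purely for the example.

```python
# Minimal sketch of a domain-age check (illustrative, not the researchers' code).
# Assumes the third-party "python-whois" package: pip install python-whois
from datetime import datetime

import whois  # provided by the python-whois package


def domain_age_days(domain: str) -> int:
    """Return the number of days since the domain was first registered."""
    record = whois.whois(domain)
    created = record.creation_date
    # Some registrars return a list of dates; use the earliest.
    if isinstance(created, list):
        created = min(created)
    return (datetime.now() - created).days


if __name__ == "__main__":
    age = domain_age_days("jacksontelegraph.com")  # hypothetical lookup
    if age < 30:  # illustrative threshold for a "very new" site
        print(f"Warning: this domain is only {age} days old; treat its stories with caution.")
    else:
        print(f"This domain is {age} days old.")
```

A newly registered domain is not proof of fabrication, but, as Mustafaraj notes, it is a cheap signal that a story deserves a second look before being shared.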

Mustafaraj said that checking the origin of a website’s domain takes time, and most web users might not even know that it’s possible or what it means. But, she said, “for the algorithms powering Twitter and Facebook, this check costs a fraction of a millisecond. It should be their responsibility to display to users information that warns about the credibility of a web source.”

Her research vision, she explained, is to provide users with tools to strengthen their critical thinking skills so that when they consume information online, they’re fully aware of its provenance, and they don’t confuse the platform delivering the information (i.e., Twitter, Google, Facebook) with its originator (the real, or fake, media source).

Mustafaraj has also been teaching her students about the fake news phenomenon. Two of them, Emma Lurie ’19 and Khonzoda Umarova ’20, completed Wellesley’s Summer Research Program in July. At the program’s culminating poster session, they won the award for best computer science/math poster for “Stop Falling for Fake News: Three Easy Steps.”