Growing up in Rome, Filippo Menczer used to watch the local con artists offer gullible tourists a chance to buy the Coliseum. The scam worked often enough that it spread and other people began doing it, until a combination of police action and human intelligence defeated it. (Well, at least no one tried to sell me the Coliseum when I visited a few years ago.)
Decades later, Menczer is focused on a different kind of propagation: how information, including misinformation, spreads over social networks. He is a professor of informatics and computer science and the director of the Center for Complex Networks and Systems Research at the Indiana University School of Informatics and Computing. He’s also one of the principal investigators of Truthy, “a system to analyze and visualize the diffusion of information on Twitter.”
The Truthy team is examining how information and memes propagate on Twitter. Truthy achieved a measure of fame for spotting political astroturf during the 2010 mid-term elections. (Watch this great video report by The Wall Street Journal to learn more about the basics of the project.)
Truthy’s initial work identifying astroturf campaigns hinted at how it and projects like it could provide valuable intelligence to journalists. I recently wrote about how Storyful is in some ways providing an outsourced social media verification service for news organizations. Truthy’s focus on information diffusion has the potential to provide critical insight into the characteristics of misinformation and how it spreads. Perhaps one day it could help detect misinformation before it spreads widely online, heading off errors, hoaxes, and other falsehoods.
“Our project is meant in general to look at how information is spread online, not just misinformation,” Menczer said. “But that is part of the picture: [if you] understand what a normal pattern is then it can help you also understand what are patterns that may indicate abuse or something that is not normal, or some misinformation or what we call astroturf or spam.”
One barrier to obtaining a clear picture of misinformation is the ever-morphing nature of information and the evolving way we use networks such as social media.
“It’s an arms race, absolutely,” Menczer told me, referring to the challenge of staying one step ahead of the scammers and liars.
Part of the difficulty is that misinformation can spread the same way as good information. News organizations now work to perfect the sharing of their work over social networks. A similar impulse drives people on the other side of the information dissemination equation—those trying to push out a hoax, spread spam, or infect users with an “I saw a real bad blog about you” phishing attack on Twitter. The same techniques that help make a good meme also provide a playbook for those seeking to game our attention and networks. An added complication, according to Menczer, is the challenge of tracing things back to their source.
“It’s very difficult to distinguish these kinds of patterns from something genuine, especially after a while,” Menczer said. “If people fall for it then they start retweeting as well and then the trace of its fakeness [becomes] buried in the initial moments. We’ve observed this as well, and so we are focused on trying to make this kind of detection very early on because after a while, if something takes off, whether it’s fake or not, it’s hard to tell.”
That means early detection is a must-have for any misinformation detection system.
“Once it’s exploded it’s very hard to beat back,” Menczer said. “You might later be able to say ‘Oh, that was fake,’ but very few people may see that.”
He said the hope is to eventually develop a platform that can be used by journalists and the public to track suspicious memes and information, determine the source, and help evaluate the accuracy. For now, he and the team of researchers in Indiana are focused on tracking the spread of information on Twitter. But the ideal platform would cover a variety of social networks and the web.
That sounds daunting, and Menczer said one of the biggest, messiest, and most complex parts of the problem is us: the human element.
“People are the most complex things that we study,” Menczer said. “We have forecast systems for the weather and we can study subatomic particles and we can study galaxies. Once we understand the physics of it, we have something to go by. And here, there is no physics.”
During our discussion, Menczer identified several elements that could form the basis of a system, though it’s of course a moving target. For now, he said a misinformation detection and debunking system would combine: