As journalists and professional fact-checkers struggle to keep up with the deluge of misinformation online, fact-checking sites that rely on loosely coordinated contributions from volunteers, such as Wikipedia, can help fill the gaps, Cornell research finds.

In a new study, Andy Zhao, a doctoral candidate in information science based at Cornell Tech, compared professional fact-checking articles with posts on Cofacts, a community-sourced fact-checking platform in Taiwan. He found that the crowdsourced site often responded to queries more rapidly than professional fact-checkers, and that the two platforms covered different ranges of issues.

“Fact-checking is a core component of being able to use our information ecosystem in a way that supports trustworthy information,” said senior author Mor Naaman, professor of information science at the Jacobs Technion-Cornell Institute at Cornell Tech and the Cornell Ann S. Bowers College of Computing and Information Science. “Places of knowledge production, like Wikipedia and Cofacts, have proved so far to be the most robust to misinformation campaigns.”

The study, “Insights from a Comparative Study on the Variety, Velocity, Veracity, and Viability of Crowdsourced and Professional Fact-Checking Services,” was published Sept. 21 in the Journal of Online Trust and Safety.

Keep reading at news.cornell.edu.