College students come up with plug-in to combat fake news
A team of college students is getting attention from internet companies and Congress after developing a browser extension that alerts users to fake and biased news stories and helps guide them to more balanced coverage.
The plug-in, "Open Mind," was developed earlier this month during a 36-hour problem-solving competition at Yale University known as a hackathon.
The winning team comprised four students: Michael Lopez-Brau and Stefan Uddenberg, both doctoral students in Yale's psychology department; Alex Cui, an undergraduate who studies machine learning at the California Institute of Technology; and Jeff An, who studies computer science at the University of Waterloo and business at Wilfrid Laurier University in Ontario.
That team competed against others to win a challenge from Yale's Poynter Fellowship in Journalism, which asked students to find a way to counter fake news.
The team's software, designed as an extension for Google's Chrome browser, displays a warning screen when someone visits a site known to disseminate fake news. It also alerts a reader when a story shared on social media is fake or biased.
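The students' code is not included here, but the site-level warning can be pictured as a Chrome content script that checks the current page against a list of flagged sources. The domain list and banner below are illustrative stand-ins, not the team's actual data or interface.

```typescript
// Minimal sketch of a content-script check, assuming a hypothetical
// hard-coded list of flagged domains. The real extension's source list
// and warning UI are not described in the article.
const FLAGGED_DOMAINS: Set<string> = new Set([
  "example-fake-news.com", // hypothetical entries for illustration
  "another-unreliable-site.net",
]);

function showWarningBanner(): void {
  const banner = document.createElement("div");
  banner.textContent =
    "Warning: this site has been flagged as a frequent source of fake or biased news.";
  banner.style.cssText =
    "position:fixed;top:0;left:0;right:0;padding:12px;background:#c0392b;color:#fff;z-index:99999;text-align:center;";
  document.body.prepend(banner);
}

// Compare the current page's hostname against the flagged list.
if (FLAGGED_DOMAINS.has(window.location.hostname.replace(/^www\./, ""))) {
  showWarningBanner();
}
```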
But it does much more than just warn.
The plug-in uses existing sentiment analysis technology to analyze any story that appears in a newsfeed, identifying the major players and any political slant. It can then suggest other stories on the same topic that offer an alternate viewpoint.
"So let's say there is an article that is very pro-Trump on a topic," said An. "We would then try to give you something more left of center. We can go out and find for you that alternative article."
The extension also collects browsing data and can show a user a graph indicating whether they have been reading stories from just one side of the political spectrum. It curates a news feed for that user, surfacing alternatives to the stories they have been reading.
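That reading-balance graph could be driven by a simple tally of visited stories per slant bucket, roughly as sketched below. The bucket names and thresholds are illustrative assumptions, not the extension's actual scheme.

```typescript
// Sketch of the reading-balance idea: count visited stories by slant bucket
// so the data can back a simple left/center/right chart.
type Bucket = "left" | "center" | "right";

interface VisitedStory {
  url: string;
  slant: number; // same assumed -1..+1 scale as above
}

function bucketOf(slant: number): Bucket {
  if (slant < -0.2) return "left";
  if (slant > 0.2) return "right";
  return "center";
}

function readingBalance(history: VisitedStory[]): Record<Bucket, number> {
  const counts: Record<Bucket, number> = { left: 0, center: 0, right: 0 };
  for (const story of history) {
    counts[bucketOf(story.slant)] += 1;
  }
  return counts;
}

// e.g. { left: 1, center: 0, right: 2 } for a right-leaning reading diet
console.log(
  readingBalance([
    { url: "https://example.com/x", slant: 0.7 },
    { url: "https://example.com/y", slant: 0.5 },
    { url: "https://example.com/z", slant: -0.6 },
  ])
);
```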
The idea, said Lopez-Brau, is to help get people out of the habit of associating on social media only with people who share their viewpoints and reading biased news coverage skewed toward their beliefs.
"Social media sites grow bubbles," said Lopez-Brau. "They make it extremely easy for people to only follow people with similar interests, so often there is no real opportunity for them to be confronted with an opposing viewpoint. They've allowed us to silo people off at a distance."
The team's prize for winning the challenge will be a meeting this spring with members of Congress.
Facebook, which was one of the sponsors of Yale's hackathon, also is interested in talking to the students as part of its ongoing work to solve the same problem, said Ruchika Budhraja, a Facebook spokeswoman.
"We're building products, many of which are very similar to what the students came up with at Yale," said Budhraja. "We have something called "Related Articles," which helps people discover articles on the same topic when they share an article."
The two Yale students plan to build a research project around the extension, tracking the browsing history of volunteers to try to determine whether the plug-in actually changes browsing habits.
"The solution is not to just tell people if something is fake or not," said Cui. "The solution is to develop a kind of a news auto-immune system."