2 college students built a tool to fight fake news on Facebook using artificial intelligence (Mic)
In the past year, fake news has become a rampant problem on Facebook.
The spread of misinformation is so broad and nebulous that it has proven challenging for the company to contain. After initially denying that the platform had a fake news problem, Mark Zuckerberg vowed in April to address it head-on. Facebook's solution was to bring in third-party fact-checkers to help vet information and add warning labels to potentially false news stories. So far, however, the system hasn't had much impact.
Meanwhile, two Berkeley college students, Ash Bhat, 20, and Rohan Phadte, 19, have taken things into their own hands.
In late April, the two computer science majors built a Facebook Messenger bot that, when fed a link, will tell you whether the article in question is or isn’t “fake news.”
The Messenger bot, named NewsBot, took them only a few weeks to build, yet it's one of the few tools of its kind. In addition to sussing out the validity of an article, it offers a barometer showing whether the piece leans left or right.
How NewsBot came together
Bhat and Phadte were both taking machine-learning classes at Berkeley this year when the idea for the tool first came to them.
“I see tech as having this responsibility in terms of giving people the tools to be more informed,” Bhat said in an interview. He and his friends, many of whom are politically active, watched false information proliferate on Facebook throughout the 2016 election and were troubled by what they saw.
Bhat, who has built civic tech tools before in his spare time, thought that perhaps they could use machine learning to build a bot that would help put articles people find on Facebook in context.
He and Phadte set to work building an algorithm.
To train the algorithm the bot runs on, they fed it more than 10,000 articles from around the web. The first and largest batch came from sites that sit far to either end of the political spectrum.
To teach the algorithm to recognize right-leaning content, they fed it thousands of articles from Breitbart, a hyperconservative news website. Then they fed it articles from BlueDotDaily to teach it to recognize left-leaning content.
Soon the bot could gauge the bias of articles from all kinds of sites in between. The more it's used, the more accurate it becomes.
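The training setup the students describe — label articles by the outlet they came from, learn which words each side favors, then score unseen text — is a classic supervised text-classification recipe. A toy sketch of the idea using a naive Bayes classifier follows; the class name, the tiny word lists, and the training examples are all illustrative stand-ins, not NewsBot's actual code or data:

```python
import math
from collections import Counter

def tokenize(text):
    """Lowercase and split into word tokens."""
    return text.lower().split()

class BiasClassifier:
    """Toy naive Bayes classifier for 'left' vs. 'right' labels."""

    def __init__(self):
        self.word_counts = {"left": Counter(), "right": Counter()}
        self.doc_counts = {"left": 0, "right": 0}

    def train(self, text, label):
        # Count how often each word appears in articles of each label.
        self.word_counts[label].update(tokenize(text))
        self.doc_counts[label] += 1

    def classify(self, text):
        # Score each label as log prior + log likelihood of each word,
        # with add-one (Laplace) smoothing for unseen words.
        total_docs = sum(self.doc_counts.values())
        vocab = set(self.word_counts["left"]) | set(self.word_counts["right"])
        scores = {}
        for label in ("left", "right"):
            counts = self.word_counts[label]
            total_words = sum(counts.values())
            score = math.log(self.doc_counts[label] / total_docs)
            for word in tokenize(text):
                score += math.log((counts[word] + 1) / (total_words + len(vocab)))
            scores[label] = score
        return max(scores, key=scores.get)

# Illustrative training snippets standing in for the labeled article sets.
clf = BiasClassifier()
clf.train("regulation healthcare union climate", "left")
clf.train("union climate progressive equality", "left")
clf.train("border tariff deregulation patriot", "right")
clf.train("tariff patriot freedom border", "right")

print(clf.classify("climate regulation equality"))  # left
print(clf.classify("border freedom tariff"))        # right
```

This also makes the bot's described weaknesses plausible: a short article contributes few words to the score, so a single loaded sentence can swing the verdict.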
But while the students have built a useful tool, it’s not perfect. “We’re still making updates and changes almost every day,” Bhat said.
When fed certain racially charged articles from Breitbart, for instance, the algorithm still marked them as neutral or unbiased. Other stories have been marked as far more left- or right-leaning than they actually are based solely on the wording of one or two sentences.
But Bhat said that these wrinkles will be sorted out in time.
“The bot does have a harder time with shorter articles right now,” he admitted, saying that there are fewer sentences to pull from in a short article.
Bhat also said that the tool would only improve as more users use it and provide feedback.
NewsBot has also begun offering short summaries of top articles. These summaries, generated by a machine-learning model, pull out the most relevant information and present it to users inside the chat window.
Since they're written by a machine, some of the summaries can read a little strangely, but they're still helpful if you're looking for a broad overview.
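Machine-written summaries of this kind are often extractive: rather than composing new sentences (which is what tends to read strangely in generated text), the simplest systems pick the article's own most representative sentences. A minimal stdlib-only sketch of that idea, which is an assumption about the general technique and not NewsBot's actual method:

```python
import re
from collections import Counter

def summarize(text, num_sentences=2):
    """Return the sentences whose words are most frequent in the text."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        # Average word frequency, so long sentences don't dominate.
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    top = sorted(sentences, key=score, reverse=True)[:num_sentences]
    # Re-emit the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in top)

# Illustrative article text.
article = (
    "The senate passed the budget bill on Tuesday. "
    "The bill funds infrastructure projects across the country. "
    "Critics argued the bill spends too much. "
    "Weather on Tuesday was unseasonably warm."
)
print(summarize(article, num_sentences=2))
```

Because the output is stitched together from the source's own sentences, transitions between them can be abrupt — one plausible reason bot-written summaries "read a little strangely."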
“We want to get people more informed and make decisions based on their views, as opposed to just being politically biased by what they read,” Bhat said.
“We want people to read more than just the headline. We want them to understand what the news they see says and who it’s coming from.”
It's an uphill battle, but the two remain unfazed. Despite looming finals, Bhat has spent hours working on NewsBot in recent weeks, pushing updates to improve the product bit by bit.
Soon, they hope the bot will be able to recognize and categorize opinion articles and commentary, in addition to other features.
Bhat would also like to see more from Facebook itself. He said that he wishes the platform would drop its facade of an impartial tech company and start verifying legitimate news sources.
“Facebook should be proactive and make it visible that they’re fighting fake news. Right now you can mark an article as fake news from a small drop-down at the top, but if you’re a user just scrolling, the feed hasn’t really changed in any way. Facebook owes it to news organizations to bring trust back,” he said.
“There’s a good portion of the country that thinks even very valid and legitimate news outlets like the New York Times and Washington Post are fake news. That’s a problem.”
In the meantime, Bhat said he hopes NewsBot can serve as a stopgap, a tool that gives users the confidence to navigate an ever-changing media climate.
“I want people to become more informed and recognize the biases in what they’re reading,” Bhat said.
“Information is one of the most valuable things and every American has a right to be informed. I just want to make that very, very easy.”