Facebook has been tackling the spread of false news across its network for the last year and a half, and now it is upping the ante with a series of updates designed to fight fake news. The effort combines technology and human review, including the removal of fake accounts, partnerships with fact-checkers, and the promotion of news literacy.
In a recent blog post, Facebook explained the scale of this initiative and some of the steps it is taking to increase the impact of fact-checking through new techniques:
“With more than a billion pieces of content posted every day, we know that fact-checkers can’t review every story one-by-one. So, we are looking into new ways to identify false news and take action on a bigger scale.”
The updates announced as part of this work include:
- Expanding the Facebook fact-checking program to new countries
- Expanding Facebook’s test to fact-check photos and videos
- Increasing the impact of fact-checking by using new techniques, including identifying duplicates and using Claim Review
- Taking action against new kinds of repeat offenders
- Improving measurement and transparency by partnering with academics
According to Tessa Lyons, Product Manager at Facebook, as part of their third-party fact-checking program, “certified, independent fact-checkers rate the accuracy of stories on Facebook, helping reduce the distribution of stories rated as false by an average of 80%.”
The clip above is from “Facing Facts,” a short film that provides a behind-the-scenes look at Facebook’s fight against the spread of misinformation. Made in collaboration with documentary filmmaker Morgan Neville, the film reveals how Facebook is thinking about this complex problem and marshaling forces against it.
“We wanted to try something different with this project,” says John Hegeman, head of Facebook News Feed. “The challenges the News Feed team faces are complex, but it’s critical that people outside the company understand what we’re doing and why. So we need to keep trying new, different ways to give people that context.”
We’re doing everything we can to fight this. 99% isn’t good enough.
The 11-minute film acknowledges Facebook’s role in spreading fake news and the heavy scrutiny the company has come under, then explores what it has done about the problem and how the fight is going.
Facebook has also created an “Investigative Operations Team”, comprising ex-intelligence officers, researchers, and media buyers, tasked with finding the worst possible things that can be done using the platform and helping the company prevent them. The goal is to spot problems early and head off future crises before they happen.
“As we double down on countering misinformation, our adversaries are going to keep trying to get around us,” says Lyons. “We need to stay ahead of them, and we can’t do this alone. We’re working with our AI research team, learning from academics, expanding our partnerships with third-party fact-checkers, and talking to other organizations — including other platforms — about how we can work together.”
False news is bad for people and bad for Facebook. We’re making significant investments to stop it from spreading and to promote high-quality journalism and news literacy.
For genuine publishers whose content has been crowded out of Facebook’s News Feed by the rampant spread of false news, this ramped-up initiative is a silver lining in the cloud of fake news.