Facebook Expanding Efforts to Block False Information


Facebook said it’s rolling out a slew of new and expanded ways to rein in the spread of misinformation across its websites and apps, according to Bloomberg.

The company said the Associated Press will expand its role in Facebook’s third-party fact-checking program. Facebook will also reduce the reach of Groups that repeatedly share misinformation, such as anti-vaccine content, hold Group administrators more accountable for violations of its content standards, and allow people to remove their posts and comments from a Facebook Group even after leaving it.

Facebook’s executives have said for years that they are uncomfortable deciding what is true and false. Under pressure from critics and lawmakers in the U.S. and elsewhere, especially since the flood of misinformation during the 2016 U.S. presidential campaign, the social media company, which has more than 2 billion users, has been altering its algorithms and adding human moderators to combat false, extreme and violent content.

While Facebook has updated its policies and enforcement efforts, content that violates the company’s standards persists. Most recently, the social network was criticized for not quickly removing the live-streamed video of the mass shooting in New Zealand.