Facebook Hiring 3,000 People to Monitor Live Video for Violence

Facebook will hire 3,000 people around the world to monitor videos and posts for violent or criminal acts, and potentially prevent tragedies from occurring, according to Bloomberg.

The social-media site has faced calls to do more, and respond faster, after a murder and a suicide were recently shown live. The new employees, who will be added over the next year, will join 4,500 people already on Facebook’s content moderation force.

The problem will eventually be solved when computers can reliably determine the content and context of video. For now, a human touch is needed, CEO Mark Zuckerberg wrote in a Facebook post.

“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote. “We’re working to make these videos easier to report so we can take the right action sooner, whether that’s responding quickly when someone needs help or taking a post down.”

Facebook doesn’t just want to remove disturbing videos. It wants to prevent the violence from occurring and make people safer, Zuckerberg said. Earlier this year, the CEO wrote a letter to users pledging to take responsibility for Facebook’s impact on its community of almost 2 billion users, whether through the spread of misinformation or civic engagement. That’s a shift from Facebook’s earlier stance as a neutral platform for its content.

“This is important,” Zuckerberg said in a recent post. “Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”