Facebook CEO Mark Zuckerberg said Wednesday that the company needs to do a better job of flagging and removing inappropriate posts and videos on the service.
The company’s long-term solution is to build AI technology to automatically detect violent or inappropriate posts and pull them down — or stop them from going up in the first place.
The short-term solution: A lot more bodies.
Zuckerberg posted Wednesday that Facebook is hiring another 3,000 content moderators to “review the millions of reports we get every week, and improve the process for doing it quickly.” Facebook already has 4,500 people around the world working on this, he added. Facebook has over 17,000 employees.
The move is in response to a spate of violent videos that have been shared to Facebook in the past month, including a murder and live confession in Cleveland that garnered a lot of attention in mid-April.
Zuckerberg didn’t mention the Cleveland video specifically on Wednesday, but it was clear that incident was a catalyst for the 3,000-person expansion.
“We've seen people hurting themselves and others on Facebook — either live or in video posted later,” Zuckerberg wrote. “It's heartbreaking, and I've been reflecting on how we can do better for our community.”
Update: In the company’s Q1 earnings call later in the day Wednesday, Zuckerberg addressed his plan to hire more content moderators, saying that Facebook users report “millions” of posts per week to the company, and that 20 percent of all videos uploaded to Facebook are live videos.
“No matter how many people we have on the team, we’ll never be able to look at everything,” Zuckerberg said.
He also talked about his plans to build AI software to automate some of this content moderation. “That will take a period of years though to really reach the quality level that we want,” he added.
This article originally appeared on Recode.net.