Facebook Inc. said it will hire 3,000 more staffers to review content in an attempt to curb violent or sensitive videos on its site without scaling back its live-streaming tools.
The planned hires, announced by Chief Executive Mark Zuckerberg in a post Wednesday, come in response to violent videos posted on Facebook, such as one last month showing a Cleveland man fatally shooting another man. A week later, a man in Thailand killed his 11-month-old daughter in a live video.
Mr. Zuckerberg's proposed fix, which would increase Facebook's roster of 4,500 reviewers by two-thirds over the next year, addresses the amount of time it takes Facebook to remove graphic content, as opposed to preventing its site from being used to display such content. The Cleveland video was up for roughly two hours; the Thailand video stayed up for 24 hours.
"If we're going to build a safe community, we need to respond quickly," Mr. Zuckerberg wrote, adding that videos posted on Facebook of people hurting themselves and others in the past few weeks have been "heartbreaking."
Mr. Zuckerberg also said Facebook would make it easier for users to report problems to the company so reviewers can more quickly determine if a post violates its standards. The company is also investing in artificial intelligence, in hopes that AI can one day detect violence as it's unfolding, but that technology is a long way off.
Meanwhile, it could be a challenge for Facebook to ramp up its content moderation team to 7,500. Most reviewers are contractors, not full-time employees, and burnout is high, experts and former workers say. A small subset of workers manages live videos.
"It's an incredible commitment of resources. It took them almost 10 years to devote 4,500 jobs to doing this," said Kate Klonick, resident fellow at Yale Law School's Information Society Project and author of a recent paper on content moderation at technology companies. "The only thing I question is being able to maintain quality on content moderation review as they take on, train and update a system for this huge number of new workers."
A Facebook spokeswoman declined to say if the new hires would be contractors or employees, or where they would be hired. They will handle a variety of objectionable content, including hate speech and child exploitation, across text, images and video.
Facebook rushed out its live-streaming tool, Facebook Live, to users last year, with Mr. Zuckerberg touting the format as raw and visceral. But the company was not prepared for the extent to which crimes, including suicides and murders, shown on live video would capture the public's attention.
"We're working to make these videos easier to report so we can take the right action sooner, whether that's responding quickly when someone needs help or taking a post down," Mr. Zuckerberg said.
Write to Deepa Seetharaman at Deepa.Seetharaman@wsj.com and Joshua Jamerson at firstname.lastname@example.org
(END) Dow Jones Newswires
May 03, 2017 13:29 ET (17:29 GMT)