Nearly two months after a mass shooting at a mosque in New Zealand that killed 51 people was livestreamed on Facebook, the social media giant announced stricter rules for its livestreaming feature, Facebook Live.
In a blog post, the company's vice president of integrity, Guy Rosen, said that since the horrific terrorist attack, Facebook has been reviewing how to prevent the service from being "used to cause harm or spread hate."
As a result, Facebook said it will now immediately bar anyone who breaks certain of its policies from using Live for a set period. For example, first-time offenders will be hit with a 30-day ban.
"For instance, someone who shares a link to a statement from a terrorist group with no context will now be immediately blocked from using Live for a set period of time," Rosen wrote.
Facebook added that it plans to extend these restrictions to other areas over the coming weeks, beginning with preventing "those same people from creating ads" on its platform.
The announcement comes amid pressure from leaders in New Zealand and France who are encouraging tech giants to do more in limiting the spread of hate content online.
On Wednesday, a new agreement called the Christchurch Call is expected to be announced at a meeting of digital leaders for the Group of Seven nations, according to CNN.
Additionally, Facebook said it plans to invest $7.5 million in new research partnerships to improve its image and video analysis technology and better identify manipulated media.