Between August 2020 and Jan. 12, 2021, Facebook removed more than 78,000 profiles across Facebook and Instagram that violated its policies against posting content related to QAnon conspiracy theories and militarized social movements.
The company, which owns Instagram, has also removed more than 37,000 Facebook pages, groups and events related to QAnon and militarized social movements. Facebook has identified more than 890 militarized social movements since August.
QAnon is a conspiracy theory built around an anonymous internet persona known as "Q," who claims President Trump is waging a secret battle against a satanic child sex trafficking ring, cannibals and the "deep state." The theory began on the 4chan web forum but gained wider attention on Facebook.
"We continue to strengthen our enforcement by identifying additional militarized social movements, new terms associated with QAnon and how people attempt to skirt our detection, including focusing more on Facebook profiles used to organize and promote these movements and groups on our platform," a Facebook spokesperson told FOX Business.
The spokesperson continued: "We’ll continue consulting experts to inform our strategy and will identify and remove content accordingly. These groups are constantly working to avoid our enforcement and we’ll continue to study how they evolve in order to keep people safe."
Facebook began removing accounts related to QAnon starting in May, according to two employees involved in the effort who spoke with The New York Times.
"We have seen growing movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them or have individual followers with patterns of violent behavior," Facebook said in an August statement.
The company tightened its restrictions on content linked to QAnon theories and militarized social movements in October.
"We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update," Facebook said in October.
Facebook gave an example of "QAnon content that celebrates and supports violence," such as "content tied to different forms of real world harm, including recent claims that the West Coast wildfires were started by certain groups, which diverted the attention of local officials from fighting the fires and protecting the public."
Lawmakers have pointed to social media platforms like Facebook and Twitter, as well as smaller websites like Parler, as places where incitements to violence and radicalization can gain traction, especially as people spend more time apart and seek engagement online amid the pandemic.
FOX Business' Michael Ruiz and Nick Givas contributed to this report.