Social media platform TikTok announced it will automatically remove videos showing nudity or sexual activity as it tightens its safety policies for minors.
"Automation will be reserved for content categories where our technology has the highest degree of accuracy, starting with violations of our policies on minor safety, adult nudity and sexual activities, violent and graphic content, and illegal activities and regulated goods," the Chinese video sharing company said in a press release Friday.
The new policy will roll out over the next few weeks and will apply to users in both the United States and Canada.
The company said the change will spare TikTok's moderators from viewing as many distressing videos and will give users a safer experience. Previously, moderators viewed every flagged video before deciding whether to remove it.
Moderators will also now have more time to focus on regulating videos involving hate speech, bullying, and misinformation, according to TikTok's head of US safety, Eric Han.
The announcement comes as the company looks to be more transparent about how it moderates content, such as reporting how many accounts belonging to children under the age of 13 it removes and changing how it notifies users when they violate community standards.
Fox News reached out to TikTok for comment but did not get an immediate response.