Facebook says it’s now treating dubious health content like clickbait and placing it lower in users’ News Feeds.
The move comes as Facebook has been struggling to address misinformation posted on its site. The first update, which Facebook product manager Travis Yeh said went up last month, reduces the reach of posts with exaggerated or sensational health claims, such as a “miracle cure.” The second update reduces the reach of posts attempting to sell products or services based on health claims, such as weight loss medication.
“We know that people don’t like posts that are sensational or spammy, and misleading health content is particularly bad for our community,” Yeh said.
Facebook said it’s dealing with the posts in a way similar to how it has handled other low-quality content, by identifying common phrases used in the posts.
The announcement about downplaying spammy medical content came out the same day as a Wall Street Journal report detailing how Facebook and YouTube have become “flooded with scientifically dubious and potentially harmful information about alternative cancer treatments.”
YouTube has also been targeting the bogus cancer content on its platform, the Journal reported. A company spokesman told the newspaper that it’s been working with medical doctors to identify unproven claims and conspiracy theories. It has also been cutting off advertising to channels that post the videos and tweaked its algorithms to reduce the number of times those videos are shown.
Phony claims about cancer treatment aren’t the only pieces of medical misinformation spreading online. In April, the Associated Press reported that Facebook had stopped recommending groups and pages that spread hoaxes about vaccines.
Some of these false claims can be harmful. The Journal’s report pointed out that videos advocating the use of a cell-killing ointment as a cancer treatment had been viewed millions of times on YouTube. Dr. David Gorski, a professor of surgery at Wayne State University School of Medicine in Detroit, said the ointment can inadvertently burn or kill healthy skin, potentially leading to an infection, and that it wouldn’t remove a cancerous growth under the skin.
Efforts to combat the misinformation so far haven’t been entirely successful, and tech company leaders generally don’t want to impinge on users’ ability to share content. Facebook CEO Mark Zuckerberg recently said he believes it would be an overreach for the company “to say, ‘Hey you shouldn’t be able to say something that is not correct to your friends.’”
In the meantime, Facebook said in its announcement this week, it will “continue working to minimize low-quality health content on Facebook.”