Facebook and YouTube have announced measures to reduce the spread of misleading health care claims after a media report revealed the proliferation of bogus cancer cures on social media. Fin24 reports that Facebook said it made changes to its page-ranking algorithm to reduce “posts with exaggerated or sensational health claims” and attempts to sell products based on these claims. YouTube said separately that it was taking similar actions.
Facebook said it made changes last month as part of efforts to reduce the spread of misleading medical claims including from groups opposing the use of recommended vaccines. “In order to help people get accurate health information and the support they need, it’s imperative that we minimise health content that is sensational or misleading,” Facebook product manager Travis Yeh said in a blog post.
“We handled this in a similar way to how we’ve previously reduced low-quality content like clickbait: by identifying phrases that were commonly used in these posts to predict which posts might include sensational health claims or promotion of products with health-related claims, and then showing these lower in news feed.”
YouTube said it had been working for some time to reduce the spread of misinformation on the platform. “Misinformation is a difficult challenge and any misinformation on medical topics is especially concerning,” a YouTube spokesperson was quoted in the report as saying.
“We’ve taken a number of steps to address this including surfacing more authoritative content across our site for people searching for cancer treatment-related topics, beginning to reduce recommendations of certain medical misinformation videos and showing information panels with more sources where they can fact check information for themselves.”
The Washington Post reports that the announcement of Facebook’s latest move against medical misinformation comes a week after it reported on the popular, private Facebook groups where hundreds of thousands of cancer patients and their families seek natural alternatives to medical treatment. In groups such as “Alternative Cancer Treatments” (7,000 members), “Colloidal Silver Success Stories” (9,000 members) and “Natural healing + foods” (more than 100,000 members), members trade anecdotes as proof that alternative treatments can cure various cancers and other illnesses.
The report says the changes Facebook announced could mean that members of groups such as “Natural healing + foods” see fewer posts from other group members in their news feeds. However, those who prefer to navigate directly to group pages will still be able to easily access posts, including those containing medical misinformation.
The report says The Wall Street Journal has also reported on similar examples of cancer misinformation on the platform.
The report says NBC News reported in May on private Facebook groups that promote dangerous methods to “heal” autism in children, including forcing a child to ingest bleach.
According to the report, medical misinformation has thrived on platforms such as Facebook and Google, drawing millions of views and creating communities of believers devoted to sharing dubious health advice. Social media companies in recent months have taken steps to curb the spread of exaggerated and harmful medical claims. Those changes came after measles outbreaks across the US prompted renewed scrutiny of the role that social media platforms play in the spread of anti-vaccine conspiracy theories.
In March, Facebook announced measures it would take to combat anti-vaccine content. The platform would stop recommending that users join groups that contain anti-vaccine content and would block advertisements promoting those conspiracy theories.
But, the report says, even as social media companies have moved to stop anti-vaccine misinformation, the online world remains saturated with medical misinformation. Until very recently, the top search results on YouTube for “cure for cancer” included videos falsely promising that baking soda and certain diets could cure cancer. Those videos have millions of views.
In May, YouTube changed how it treats search results for “cure for cancer” and several other cancer-related terms; although the videos containing viral misinformation are still available on YouTube, they no longer appear in the top results, and authoritative sources are displayed prominently.