

How Trump and his son helped make a Covid-19 conspiracy theorist go viral in a matter of hours

A doctor who thinks alien DNA is used in medicine now says hydroxychloroquine is the cure for Covid-19.

Twitter suspended Donald Trump Jr. after he shared a video that claimed hydroxychloroquine was “a cure” for Covid-19. President Trump shared the same video with his followers through a retweet.
Mark Wilson/Getty Images


Social media platforms are struggling to contain a new round of coronavirus conspiracy theories, thanks in part to Donald Trump.

On Monday night, the president retweeted accounts that posted a video falsely claiming that hydroxychloroquine cures Covid-19, including one tweet from his son Donald Trump Jr. Many of those tweets were later removed, and Twitter suspended several of the users behind them, including Trump’s son, for 12 hours. But the video itself has continued to spread across social media platforms, raising fresh questions about how companies like Facebook and Twitter handle misinformation.

The video in question, which Trump Jr. called a “must watch,” features Houston doctor Stella Immanuel, who claimed that a combination of hydroxychloroquine, zinc, and the antibiotic Zithromax was a “cure” for the coronavirus and that “you don’t need to wear a mask.” The Food and Drug Administration (FDA) has said that hydroxychloroquine is “unlikely to produce an antiviral effect,” and the Centers for Disease Control and Prevention (CDC) recommends wearing masks to stop the spread of the virus. The video of Immanuel quickly went viral, drawing millions of views on Facebook, Twitter, and YouTube in a matter of hours.

The event itself also had political backing. Immanuel was speaking at a gathering called the “White Coat Summit,” held on Monday by a group called America’s Frontline Doctors. The press conference, staged on the steps of the Supreme Court, was organized by the right-wing group Tea Party Patriots and also featured Rep. Ralph Norman (R-SC). A spokesperson for Norman told Recode that the congressman didn’t know ahead of time what Immanuel was going to say.

“While the Congressman does not agree with her statement on the use of masks, and certainly has no expertise in medications, he strongly believes that she has a right to say what she came to say without being censored by big tech,” Rep. Norman’s spokesperson said.

What got Don Jr. suspended from Twitter

According to the Daily Beast, Immanuel has a history of strange medical claims, including that fibroids and cysts are caused by having sex with demons in dreams and that alien DNA is being used in medical treatments. Records from the Texas Medical Board show that Immanuel, who was trained as a pediatrician, is a licensed physician with a practice in Houston that has the same address as her church, Fire Power Ministries. Immanuel’s sermons, several of which are hosted on her YouTube channel, include messages of support for President Trump. Most of those videos have only a few thousand views.

Why this specific video about hydroxychloroquine went viral so quickly likely comes down to a combination of its controversial subject matter and the high profiles of the right-wing personalities and publications that shared it. Although Twitter attracted attention by suspending Don Jr.’s account for posting the video of Immanuel’s speech, the source of the virality appears to be the conservative publication Breitbart, which shared a livestream of the speech with its 4.7 million Facebook followers. Facebook later removed the post, but not before it drew millions of views on the platform.

“We’ve removed this video for violating our policies that prohibit false claims about cures for Covid-19, since no such cure currently exists,” a Facebook spokesperson told Recode in a statement. “Under this policy, we remove posts that make claims like this video did that hydroxychloroquine is an absolute cure for Covid-19. From April to June we removed more than 7 million pieces of content on Facebook and Instagram for violating this policy and have shown messages to people who have reacted to, commented on or shared this kind of content.”

YouTube also says it has taken steps to remove the video. “We have removed the video for violating our COVID-19 misinformation policies,” a spokesperson for Google, which owns YouTube, told Recode.

The Breitbart detail is especially problematic for Facebook. The social network drew criticism after it named Breitbart as a partner in its Facebook News initiative, which collects trusted news sources in a dedicated tab. Once described by co-founder Steve Bannon as “a platform for the alt-right,” Breitbart is also known to spread misinformation. A Facebook spokesperson confirmed to Recode that Breitbart is still eligible to appear in the News tab, meaning its stories can surface in users’ personalized feeds there, but said its content has never appeared in the top stories section, which is curated by Facebook employees.

Meanwhile, hydroxychloroquine continues to be the source of many conspiracy theories, so one might expect social media companies to be prepared to halt the spread of dubious content related to the topic. Some proponents of these theories simply deny the results of studies showing that hydroxychloroquine is not an effective cure, while others cite anecdotes they hear from doctors, quote evidence from less-than-ideal trials, and cherry-pick their facts. According to John Gregory, a senior analyst for health at NewsGuard, an app that rates news site trustworthiness, there’s also “a larger, vaguer narrative that hydroxychloroquine is being held back because Trump spoke about it, and because it’s politically beneficial for Democrats in the United States if hydroxychloroquine doesn’t work.”

But it’s not entirely clear where any of these social media platforms draw the line between actionable misinformation and allowable claims about hydroxychloroquine. After all, this isn’t the first time that the major social media platforms have been forced to chase down viral videos involving doctors making unproven medical claims.

In early May, a conspiracy video called Plandemic racked up millions of views after being shared in Facebook conspiracy groups, including ones associated with QAnon. The 26-minute video featured a discredited doctor making a slew of false claims, such as the idea that the novel coronavirus was manipulated in a lab, and pushed the notion that hydroxychloroquine could be an effective treatment for Covid-19. The Plandemic video was removed from the major platforms, but not before it “had been viewed more than eight million times on YouTube, Facebook, Twitter and Instagram, and had generated countless other posts,” according to the New York Times.

The video featuring Immanuel and shared by the Trumps appears to have gone viral much more quickly and garnered millions more views. According to preliminary research from Zignal Labs, a media intelligence platform, the original video livestream received about 18 million engagements on Facebook. There were more than half a million mentions of the video on Twitter and other non-Facebook platforms as of Tuesday morning. Other estimates put the number of views of the video at over 20 million.

How Facebook and Twitter keep struggling with Covid-19 misinformation

Facebook has generally taken a hands-off stance toward the president’s posts. The company did remove campaign ads that promoted a “census” that was not the official census, as well as ads featuring imagery associated with Nazis, but it declined to act on a Trump post that said “when the looting starts, the shooting starts” in reference to nationwide anti-police brutality protests. To date, Facebook has not deleted any of Trump’s posts related to hydroxychloroquine.

Twitter has been more aggressive in its dealings with the president. The company first fact-checked Trump in May, applying warning labels to two tweets that included misinformation about voting by mail. A Trump tweet with the same “shooting ... looting” language as his Facebook post also received a warning label. In this week’s incident involving the Immanuel video, several of the tweets President Trump retweeted have been deleted, though others claiming that hydroxychloroquine can treat the coronavirus remained up as of publication. (Twitter did not respond to a request for comment.)

Still, what Trump tweets about hydroxychloroquine can influence online conversations. According to research from Timothy Mackey, a public health professor at UC San Diego, the president’s posts seemed to contribute to an increase in discussion of hydroxychloroquine and other Covid-19-related medications on social media in the past day.

“Hydroxychloroquine social media conversations spike when there are statements from the Trump administration and officials, which create an echo chamber of related news coverage,” Mackey explained in an email. “That then leads to aggregators [and] supporters of Trump who amplify that message [and] misinformation, and then that gets exposed to other users, who react with both positive and negative sentiment.”

Some might argue that any posts claiming that hydroxychloroquine is a cure for Covid-19 should get flagged for misinformation, if only because most scientific studies tell us that the drug doesn’t make a difference. Much like mask-wearing, however, the use of hydroxychloroquine has become politicized.

Typically used to prevent malaria and to treat certain autoimmune disorders, hydroxychloroquine was identified as a potential coronavirus treatment after a February report from a French doctor claimed that a combination of the drug and Zithromax had cured 100 percent of the coronavirus patients in his small sample. Conservative pundits, and then the president, seized on the idea that the drug might be some sort of cure, but later, more rigorous studies showed the combination has no apparent impact on the virus. The FDA revoked its emergency use authorization of hydroxychloroquine on June 15, and the National Institutes of Health halted its clinical trial of the drug on June 20. Yet claims that hydroxychloroquine is an effective treatment for Covid-19 persist, and Trump said in May that he had taken a two-week course of the drug.

Again, Twitter has taken an assertive approach when it comes to misinformation about Covid-19 treatments. Following the virality of the Immanuel video, the company made “Hydroxychloroquine is not an effective treatment for Covid-19, according to the FDA” one of its trending topics. This happened just hours after Trump castigated Twitter for never featuring “good” trends about him, calling the omission “illegal.” Twitter did not append a fact-check to that tweet, but there is no US law that says social media platforms have to feature good trending topics about the president.

As is often the case when social media companies moderate users who violate their policies, some are accusing Facebook and Twitter of censorship. The very fact that tech companies take down certain content can motivate other users to post it again. This can even be part of a deliberate strategy: posting controversial content across multiple platforms and domains so that moderators end up in a game of whack-a-mole. And in many cases, a post or video may already be well on its way to going viral before Facebook or Twitter realizes that misinformation is spreading.

“It’s extremely difficult for them to detect these things before they’re uploaded,” explained James Grimmelmann, a law professor at Cornell who studies content moderation. “You’re not going to be able to use AI to distinguish true claims about Covid from false claims about Covid from false claims being presented to criticize.”

To avoid the problem of misinformation being shared too widely, Grimmelmann suggests that Facebook, Twitter, and other social media companies should set a threshold for looking at content that appears to be going viral and then have human reviewers look at it. “You should be able to have somebody look at it and make the decision before it has been [seen by] 17 million,” Grimmelmann said.
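The threshold Grimmelmann describes works like a circuit breaker. Here is a minimal sketch in Python of how that logic might look; the class, the numbers, and the queue are hypothetical illustrations of the idea, not anything Facebook or Twitter is known to have built. Once a post’s shares within a sliding window cross a set threshold, it is held for a human moderator before it can spread further.

```python
import time
from collections import deque

# Hypothetical illustration of a "review threshold" circuit breaker.
# None of these names or numbers reflect any real platform's system.

REVIEW_THRESHOLD = 10_000  # shares within the window that trip a human review
WINDOW_SECONDS = 3_600     # sliding one-hour window

class ViralityMonitor:
    """Counts recent shares per post and flags fast-spreading posts for review."""

    def __init__(self):
        self.shares = {}           # post_id -> deque of recent share timestamps
        self.review_queue = set()  # post_ids waiting on a human moderator

    def record_share(self, post_id, now=None):
        """Record one share event; return True if the post was just flagged."""
        now = time.time() if now is None else now
        timestamps = self.shares.setdefault(post_id, deque())
        timestamps.append(now)

        # Evict shares that have fallen out of the sliding window.
        while timestamps and now - timestamps[0] > WINDOW_SECONDS:
            timestamps.popleft()

        # Trip the breaker: hold further amplification until a human decides.
        if len(timestamps) >= REVIEW_THRESHOLD and post_id not in self.review_queue:
            self.review_queue.add(post_id)
            return True
        return False
```

In practice, the hard questions are operational rather than algorithmic: how low the threshold can go before the review queue overwhelms the moderation staff, and how quickly a human can act once a post is held.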

At this point, it may be too late to contain this latest round of Covid-19 misinformation. On Monday evening, Immanuel complained that her Facebook profile page and videos had been removed, threatening that the entire platform would “be down in Jesus name” if they were not restored. She then went silent for about 13 hours. By Tuesday afternoon, Immanuel was back and had tweeted her video again. It gained 104,000 views and 11,200 retweets in 90 minutes and has yet to be removed.

Update 7:55 pm July 28, 2020: This article has been updated to include a statement from Facebook and additional commentary from experts.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.