Why Instagram’s anti-vaccine problem isn’t surprising

The image-based platform has for years been a gold mine for hucksters selling medical misinformation or outright lies.

An investigation by the Atlantic found Instagram teeming with anti-vaccine memes and hashtags.
Joe Raedle/Getty Images

Instagram is full of conspiracy theories and anti-vaccine pages, Taylor Lorenz reported for the Atlantic on Thursday.

The Facebook-owned platform, she writes, “is teeming [with] conspiracy theories, viral misinformation, and extremist memes, all daisy-chained together via a network of accounts with incredible algorithmic reach and millions of collective followers.”

The Atlantic’s investigation comes just two weeks after Facebook VP of global policy management Monika Bickert wrote, in a company blog post, that Facebook was working to “tackle vaccine misinformation.” This included a promise not to “show or recommend content that contains misinformation about vaccinations on Instagram Explore or hashtag pages.” In response to the investigation, a spokesperson told the Hill, “As part of our work to address health-related misinformation on Instagram, we’re looking at ways to minimize recommendations of this content and accounts that post it across Instagram.” The spokesperson specifically listed the Explore tab, hashtags, and “Suggested For You” as areas of focus.

But Instagram seems to be making slow progress in removing anti-vaccine content. CNN reports that the platform has blocked hashtags like #vaccinescauseautism and #vaccinesarepoison, the most obvious disinformation campaigns. Still, on Friday morning, when I typed “vaccines” into the Instagram search bar, the first suggested result was the account @vaccinesuncovered, which has 43,000 followers and the bio “Real Stories. Vaccine Injuries. Vaccine Deaths. What mainstream media won’t show you,” and links to a “gentle vaccine detox.” After the generic #vaccines hashtag (which is full of anti-vaccine content as well), the next suggested results were the account @vaccines_are_toxic, the hashtag #vaccineskill, @christiansagainstvaccines, and @vaccinesaregenocide.

As Lorenz points out, part of the broader issue with misinformation on Instagram is that it’s not readily obvious to non-teenagers that the platform is being used for anything other than sharing personal photos. Her reporting over the past several years has made clear, though, that many younger people treat the platform as a primary news source, a personal blog, and a place to develop a worldview, one often filtered through prisms of irony.

A collection of anti-vaccine memes posted by @vaccines_are_toxic.

The most popular claims about the dangers of vaccines have been thoroughly debunked by scientists, yet Instagram isn’t the only major online platform with a misinformation problem, or even specifically an anti-vaccine problem. For one thing, it’s an extension of Facebook’s long-standing issue with the promotion of fake science and conspiracy theories. Both YouTube and Pinterest have been pressed to take serious action in the last month, and Amazon recently de-listed all of the anti-vaccine books for sale on its website. (A recent report from Vox’s Julia Belluz also emphasized that social media is only one part of the thoroughly bizarre and dangerous spread of anti-vaccine sentiment, which has been disseminated through traditional media like books, movies, and celebrity-financed documentaries.)

In a follow-up report for Motherboard, published Thursday, reporter Joseph Cox highlights the importance of Instagram’s recommendation engine, which can push a user who follows one misleading or malicious medical page to follow a couple dozen more within a matter of minutes. Notably, the algorithm conflated all kinds of health-related content, bouncing Cox between anti-vaccine pages and accounts interested in “plant-based diets.”

That’s why, though the Atlantic piece is disturbing and revelatory, it’s not exactly surprising that content like this is spreading on Instagram. The platform is built to literally trade in appearances and to commodify illusions of health, beauty, and domestic bliss, and it has been used for years to make money for companies and individuals with an image to sell and not much science to back it up.

Last month, Suzanne Zapello reported for Vox on the rise of “medical sponcon,” explaining how big pharmaceutical companies have started partnering with influencers to sell new drugs and medical devices, using them to build trust in and excitement around a new product by posting images of beautiful results and editing out anything uncomfortable. Last year, the fertility-tracking app Natural Cycles blew up on Instagram by marketing itself as foolproof contraception, hiding its 93 percent effectiveness rate, and the unwanted pregnancies that resulted, behind a curated selection of perfect smiles. The laxative tea — or teatox — industry is legendary on Instagram at this point; influencers have notoriously been paid well into the six figures to promote the weight-loss quick fix, which is widely reviled by medical professionals. Then it was adaptogens. Now it’s celery juice.

The wellness boom is, obviously, not all evil, and drinking celery juice for a few weeks because a celebrity said to is likely gross but not on the same plane as denying your kid a vaccination. But when recommendation algorithms tie all the beautiful and the false together into one big glamorous spiderweb, it can be hard to know when to wriggle out.
