

The surgeon general wants Facebook to do more to stop Covid-19 lies

Dr. Vivek Murthy considers social media misinformation to be a deadly public health threat.

Vivek Murthy, the US surgeon general, speaks during a news conference and holds up a piece of paper that says, “confronting health misinformation.”
Dr. Vivek Murthy, United States Surgeon General, is taking a stand against health misinformation.
Samuel Corum/CNP/Bloomberg via Getty Images
Sara Morrison
Sara Morrison was a senior Vox reporter who covered data privacy, antitrust, and Big Tech’s power over us all, beginning in 2019.

US Surgeon General Vivek Murthy says that misinformation — much of it on tech platforms — is a public health threat that has cost people’s lives and prolonged the Covid-19 pandemic.

As Murthy said in a Thursday press conference, health advisories are usually about things people physically consume: food, drinks, cigarettes. But the first advisory of his tenure in the Biden administration (he was also the surgeon general under President Obama) is about what we consume with our eyes and ears: misinformation.

The advisory comes with a set of guidelines on how to “build a healthy information environment,” with recommendations for everyone from social media users up to the platforms themselves (also: health workers, researchers, and the media). Murthy also went on some of those very platforms to spread the message, including Twitter and Facebook.

“Today, we live in a world where misinformation poses an imminent and insidious threat to our nation’s health,” Murthy said in a press conference, adding that “modern technology companies” have allowed misinformation and disinformation to spread across their platforms “with little accountability.”

The advisory isn’t a set of orders that must be followed by these companies, but the increased scrutiny and attention does put pressure on them to more aggressively combat the falsehoods spreading on their platforms.

Sen. Josh Hawley (R-MO), a frequent Big Tech critic, has already pushed back against the advisory, accusing Facebook and Twitter of colluding with the Biden administration to censor speech. Press secretary Jen Psaki told reporters that the White House has been in contact with those platforms and flags problematic content to them, which Hawley interpreted to mean that the platforms have “functionally become arms of the federal government.”

The health advisory comes as Covid-19 vaccination rates in the United States are dropping, cases are picking back up, and the fast-spreading delta variant takes hold. The vast majority of Covid-related hospitalizations and deaths have been for people who aren’t vaccinated, despite the widespread availability of vaccines in the US. And with some people choosing not to get vaccinated because they believe misinformation about the vaccines, the Biden administration has reportedly decided it’s time to fight back.

Coronavirus misinformation doesn’t only appear on social media. But social media gives it a stage and reach that offline platforms don’t have, and this has been a concern for years. Mis- or disinformation potentially influenced the outcome of the 2016 presidential election, increased political polarization, contributed to the rise of the QAnon conspiracy theory, played a role in the ethnic cleansing of the Rohingya Muslims in Myanmar, and, now, has helped to prolong the pandemic.

As researcher Carl T. Bergstrom, co-author of “Stewardship of global collective behavior,” a paper that calls for more research into social media’s impact on society, told Recode’s Shirin Ghaffary, “social media in particular — as well as a broader range of internet technologies, including algorithmically driven search and click-based advertising — have changed the way that people get information and form opinions about the world. And they seem to have done so in a manner that makes people particularly vulnerable to the spread of misinformation and disinformation.”

For their part, social media platforms have made attempts to stop the spread of false information, including removing posts and videos and banning accounts that spread it, as well as appending fact-checks or links to trusted information on posts and videos that might be misleading. As it became clear at the end of 2020 that a Covid vaccine would soon arrive, various platforms proactively prepared for the vaccine misinformation that would (and did) inevitably follow. This came after years of these companies doing very little to stop the spread of misinformation about other vaccines, and despite many warnings from experts about the potential harm to public health done by hosting anti-vaccine content and communities.

“We agree with the Surgeon General — tackling health misinformation takes a whole-of-society approach,” a Twitter spokesperson told Recode in a statement. “We’ll continue to take enforcement action on content that violates our COVID-19 misleading information policy and improve and expand our efforts to elevate credible, reliable health information — now, amid the COVID-19 pandemic — and as we collectively navigate the public health challenges to come.”

YouTube spokesperson Elena Hernandez told Recode that the platform “removes content in accordance with our COVID-19 misinformation policies, which we keep current based on guidance from local health authorities. We also demote borderline videos and prominently surface authoritative content for COVID-19-related search results, recommendations, and context panels.”

And Kevin McAlister, of Facebook, told Recode that the company has “partnered with government experts, health authorities, and researchers to take aggressive action against misinformation about COVID-19 and vaccines to protect public health,” removing millions of pieces of Covid-19 misinformation while trying to guide users to trusted sources about the virus and vaccines.

But many believe the platforms’ efforts came too late and still don’t go far enough — including, it seems, the surgeon general.

“We expect more from our technology companies,” Murthy said.

Let’s see if we get it — and if, at this point, it will help.


Update, July 16, 11:45 am ET: Added Sen. Hawley’s statement and a comment from Facebook.
