Inside the CDC’s Campaign To Police COVID Speech

For years, various branches and levels of the federal government tried their darndest to grab more control of what could be said on social media. Sex trafficking, election integrity, hate speech, Chinese influence—all served as fodder for legislation, regulation, or executive action geared toward seizing the means of content moderation, with some degree of success.

But it was the COVID-19 pandemic that really did the trick—as my colleague Robby Soave details in Reason’s March 2023 cover story.

During peak pandemic times, “the federal government shaped the rules of online discussion in unprecedented and unnerving ways,” Soave writes. Some of this was confirmed recently by documents that Twitter CEO Elon Musk shared with journalists. Soave uncovered evidence of similar shenanigans at Facebook:

According to a trove of confidential documents obtained by Reason, health advisers at the CDC had significant input on pandemic-era social media policies at Facebook as well. They were consulted frequently, at times daily. They were actively involved in the affairs of content moderators, providing constant and ever-evolving guidance. They requested frequent updates about which topics were trending on the platforms, and they recommended what kinds of content should be deemed false or misleading. “Here are two issues we are seeing a great deal of misinfo on that we wanted to flag for you all,” reads one note from a CDC official. Another email with sample Facebook posts attached begins: “BOLO for a small but growing area of misinfo.”

These Facebook Files show that the platform responded with incredible deference. Facebook routinely asked the government to vet specific claims, including whether the virus was “man-made” rather than zoonotic in origin. (The CDC responded that a man-made origin was “technically possible” but “extremely unlikely.”) In other emails, Facebook asked: “For each of the following claims, which we’ve recently identified on the platform, can you please tell us if: the claim is false; and, if believed, could this claim contribute to vaccine refusals?”

Facebook, Twitter, and other tech companies were under extreme pressure to acquiesce to government demands on this front. In July 2021, President Joe Biden accused social media platforms of “killing people” by not stopping the spread of COVID-19 misinformation. And he was far from alone in using this sort of rhetoric.

All of this followed years of political leaders excoriating tech companies for not stopping the spread of everything from sex ads to Russian memes to white nationalist rhetoric. Throughout 2020 and 2021, tech executives were routinely hauled before Congress to answer absurd questions about their processes. Bill after bill sought to take away their protection from civil liability for things that users posted, to micromanage their algorithms and the way they handled specific sorts of content, and to modify antitrust laws in ways that would cut into their business. Facebook and Google both faced federal lawsuits.

So even if authorities at the Centers for Disease Control and Prevention (CDC) didn't directly order Twitter and Facebook to do their bidding, you can imagine how tech executives might have felt they had little choice.

“If you look at it in isolation, it looks like [the CDC and the tech companies] are working together,” Jenin Younes, litigation counsel for the New Civil Liberties Alliance, told Reason. “But you have to view it in light of the threats.”

More from Soave:

Facebook is a private entity, and thus is within its rights to moderate content in any fashion it sees fit. But the federal government’s efforts to pressure social media companies cannot be waved away. A private company may choose to exclude certain perspectives, but if the company only takes such action after politicians and bureaucrats threaten it, reasonable people might conclude the choice was an illusion. Such an arrangement—whereby private entities, at the behest of the government, become ideological enforcers—is unacceptable. And it may be illegal.

* Article from: Reason