Information Space Integrity

Using the Truth to Tell a Lie: Authoritarian COVID-19 Vaccine Mal-Information Strategies

By Jessica Brandt and Bret Schafer

Over the past year, the regimes in Russia, China, and Iran have used a combination of public diplomacy, propaganda, and overt and covert disinformation campaigns to portray their respective responses to the pandemic as superior to those of the West. This tactic is in keeping with a broader strategy of denigrating Western democracies and highlighting the ostensible strengths of their own respective governance models. That vaccines have become the latest flashpoint in this emerging competition of ideas is not surprising, given that they are a pathway to power in multiple forms, from market power to soft power, and can be leveraged for sharper forms of political impact.

Much of the reporting on the use of information operations to manipulate vaccine narratives has focused on disinformation. In doing so, it misses the steady drumbeat of factual, but misleading coverage—for example, selective and repeated reporting on adverse reactions to certain vaccines—that can shape public opinion over time.

This use of “mal-information,” or genuine information presented without proper context to deceive audiences, has enabled the regimes in Moscow, Beijing, and Tehran and their respective state media to claim that their coverage has been misrepresented. In response to a recent Alliance for Securing Democracy (ASD) report that criticized Russian media’s manipulation of vaccine narratives, RT asserted that “demanding ‘context’ is a favorite trick of so-called fact-checkers to argue that something is false even when factually true.”

The use of mal-information is not merely a rhetorical device; it has real impact. Typically, social media, civil society, and government responses to disinformation are oriented toward exposing and mitigating falsehoods, not half-truths. Though skilled fact-checkers eschew binary, “fact versus fiction” ratings to avoid this trap, the use of cherry-picked or poorly contextualized information allows manipulators to deceive audiences while skirting rules, norms, and countermeasures designed to combat outright disinformation.

An analysis of 35,000 vaccine-related messages from Russian, Chinese, and Iranian diplomats, government officials, and state media outlets on Twitter, captured by ASD’s Hamilton 2.0 dashboard over a three-month period, suggests three trends related to these regimes’ use of mal-information:

  • Rather than promote verifiably false information about certain Western vaccines, Moscow, Beijing, and Tehran regularly sensationalize reports of safety concerns and downplay mitigating context. Pfizer’s vaccine, for example, has received disproportionate negative attention, perhaps because, as the first approved Western vaccine, it is seen as the primary competition. Russian state media in particular has regularly used cherry-picked examples, stripped of context, to imply a causal connection between the administration of Pfizer vaccines and subsequent deaths. And Iran’s Fars News Agency tweeted that the Pfizer vaccine “kill[ed] six people in America,” omitting that four of the six people who died had received a placebo and that authorities ruled out a causal connection to the deaths of the other two. The tweet linked to a Fars News article that provided those details, but research has shown that headlines (or tweets) have an outsized impact on how audiences perceive the information that follows—and that’s if they even choose to read it.
  • Russian, Chinese, and Iranian state media each promote the idea that Western media coverage is biased against Russian and Chinese vaccines and regularly ignores safety concerns related to Western-developed vaccines. These claims are contradicted by evidence. For example, U.S.-government funded Radio Free Europe/Radio Liberty mentioned Sputnik V more than 20 times during the studied period. Only four of those tweets (or 17 percent) were negative. Major incidents that Western media was accused of ignoring—for example, deaths in a Norwegian nursing home after the administration of the Pfizer vaccine—were indeed covered by multiple international outlets, such as AP, Bloomberg, CNN, and Reuters.
  • Moscow and Beijing are sophisticated in tailoring their vaccine messages to target audiences, particularly in the global South. For example, China’s state media have highlighted the challenges of cold chain storage in warm environments and questioned vaccine manufacturers’ motives in places that have a traditionally anti-capitalist bent. At the same time, Beijing has positioned its own vaccines as readily accessible global public goods available to developing countries, when in fact it has underperformed in delivering these vaccines. Russia, too, has accused Western vaccine manufacturers of “profiteering” at the expense of human lives in low-income countries, while promoting the affordability of Sputnik V, even though the African Union will pay three times more for it than for doses of the Oxford-AstraZeneca vaccine.


Mal-information is a unique vector of authoritarian information manipulation, and democracies need a coordinated strategy to contend with it. After years of focusing on exposing inauthentic accounts, disrupting manipulative behavior, labeling inaccurate information, and removing offending content, practitioners in government, the private sector, and civil society must now develop strategies to address mal-information, which can be as damaging as explicit disinformation, but far more difficult to fact-check and moderate.

To implement this strategy and combat mal-information, action will be needed from all corners of society. Democratic governments can bring new energy to public diplomacy efforts of their own by highlighting their transparent, norm-respecting, and science-based approach to vaccines. Social media platforms can take proactive steps to identify, verify, and amplify authoritative voices on matters that could foreseeably result in offline harm, which would help drown out mal-information. They can also improve the labeling of state-sponsored content so that it is more readily visible to users. Academics and other researchers who are already doing important work exposing and countering disinformation in their respective societies can use their tradecraft to document mal-information. Existing media literacy efforts can also raise public awareness of the challenge. Meanwhile, independent media—including local journalists, especially in closed spaces—can provide accurate, contextualized information.

Doing so should help dispel a notion authoritarians regularly promote: that there is no such thing as objective truth. Further reflection is needed about how to build societal resilience to mal-information. Developing a nuanced picture of this activity as it is occurring is the first step.


Jessica Brandt is head of policy and research for the Alliance for Securing Democracy and a fellow at the German Marshall Fund of the United States. Bret Schafer is the Alliance for Securing Democracy’s Media and Digital Disinformation Fellow and is the creator and manager of Hamilton 2.0, an online open-source dashboard tracking the outputs of Russian, Chinese, and Iranian state media outlets, diplomats, and government officials. Follow them on Twitter @jessbrandt and @SecureDemocracy.

The views expressed in this post represent the opinions and analysis of the authors and do not necessarily reflect those of the National Endowment for Democracy or its staff.


Image credit: Gustavo MS_Photography /