
Addressing Health Misinformation with Health Literacy Strategies

Proceedings of a Workshop—in Brief

Alexis Wojtowicz, Rapporteur.

Washington (DC): National Academies Press (US); 2020.
ISBN-10: 0-309-14385-3

December 2020

On July 29, 2020, the Roundtable on Health Literacy convened a public workshop to explore the challenges resulting from the proliferation of health and medical misinformation and disinformation, particularly as they relate to the coronavirus disease 2019 (COVID-19) pandemic. The virtual workshop explored the role of fact-checking organizations (FCOs) and the technology industry in addressing misinformation and disinformation, the social psychology behind their spread, and health literacy strategies to support this ongoing multidisciplinary work. This proceedings was prepared by the rapporteur as a factual summary of what occurred at the workshop. Statements, recommendations, and opinions expressed are those of individual workshop participants and are not necessarily endorsed or verified by either the Roundtable on Health Literacy or the National Academies, and they should not be construed as reflecting any group consensus.

Lawrence Smith, chair of the Roundtable on Health Literacy; executive vice president and physician in chief at Northwell Health; and dean of the Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, welcomed attendees to the virtual workshop. The workshop, he explained, would examine the rise of health misinformation and would use COVID-19 as a case study to explore health literacy strategies that may be used to mitigate such misinformation. Smith introduced the two moderators for the panel: Ruth Parker, professor of medicine, pediatrics, and public health at Emory University, and Laurie Myers, global health literacy director for Merck.


Parker opened the panel discussion with Kate Starbird, associate professor in the Department of Human-Centered Design & Engineering and director of the Emerging Capacities of Mass Participation Laboratory at the University of Washington, asking her to explain the differences between misinformation and disinformation.

Starbird noted that the distinction between misinformation and disinformation is “really important when we think about strategies for addressing false information, online or elsewhere.” The definitions are still in development as is the field of disinformation studies itself, she added, but misinformation is presently understood by most as “false information that is not intentionally false,” whereas disinformation is “false or misleading information spread with some kind of intent—usually political, reputational, or financial.” Disinformation, she continued, is not just one piece of information: It is part of a campaign or a set of different narratives, frequently with a factual or a plausible core wrapped in layers of false information or removed from its original context.

“It is not as simple as saying that piece of information is true or false,” she said, “but considering why it is being spread now, who is spreading it, and what is the intent for spreading it. That makes the challenge of identifying disinformation and removing malicious information a much different task than fact checking something as being true or false.”

“Misinformation doesn’t spread itself,” Starbird said. “We spread it—‘we’ being ‘everybody who participates in information spaces.’ And we, as humans, are particularly vulnerable to spreading misinformation during crisis events like pandemics, due to the uncertainty of the information space” and anxieties borne out of that vacuum.

Parker noted that many workshop registrants had submitted questions before the workshop asking about how to identify signals that information they are consuming may not be entirely true.

Media literacy efforts have often concentrated on logically examining the information in question along with its source, Starbird replied, but reactions to news or information are not always approached logically. There is a component of emotional manipulation that might compel someone to quickly share information online without considering it. That is not to say that someone should not share information because it made them emotional, she said, but individuals should reflect on the emotion they feel, why they feel that way, and what motives someone might have for inducing that emotion, ultimately slowing down the process between seeing information online and sharing it with others.

Parker asked Starbird to share her thoughts on the evolution of science, its uncertainty, and how it relates to the current COVID-19 pandemic and misinformation.

Fields like crisis informatics and the social psychology of rumor during crisis events tell us that one of the reasons misinformation spreads during crisis events has to do with uncertainty, Starbird explained. “We are not sure what actions we should take. We have a tendency to come together, try to gather information, and try to collectively make sense of it,” she added. The COVID-19 pandemic is particularly uncomfortable for people because they are dealing with months of uncertainty, as opposed to the 2 or 3 days after an earthquake. “Things are changing underneath our feet. The best understandings today are very different from the best understandings from a week or a month ago. With facts changing, we have to update our mental model around things, and a lot of us aren’t very good at that,” she said. Those vulnerable moments can create opportunities for people interested in spreading disinformation. Individuals can take advantage of the uncertainty in times of crisis to spread or create false narratives, frequently in service of a political objective. She added, “that definitely makes this pandemic an even more complicated situation, and perhaps at this point, we can even begin to consider how that is costing lives in the United States.”

Parker asked if it would be helpful for most people using the Internet to consider approaching misinformation and disinformation differently. It is difficult, Starbird answered, because disinformation requires assessing intent, which is harder to assess compared with whether something is true or false. Several social media platforms have tried to develop policies to prevent the spread of disinformation, and even they struggle with that distinction, she noted. “We need better tools for understanding that, and we need better cues from social media platforms to be able to assess intent,” she said, adding that “we also need better information about where the information has originated.” While individuals should be perceptive and cautious participants in information spaces, she said, most do not yet have the resources to assess the difference between misinformation and disinformation. “That’s a platform design problem as much as anything else.”

Starbird explained that she studied rumors during crisis events in 2013 and observed the spread of disinformation. The disinformation included conspiracy theories that “seemed to be selectively amplified for political objectives.” At the time, she said, she thought it was marginal and did not think it worth pursuing further, but 2 years later, “we began to recognize that disinformation was becoming a bigger and bigger part of the picture.” There was infrastructure—network structures and connections between accounts—that habitually spread disinformation, she said, and it was beginning to reshape how information moved in information spaces. “Increasingly, we have plenty of cues indicating this is a really big problem—disinformation is being repeated by political leaders, not just in the U.S., but all over the world, and we’re seeing it show up in what we would consider mainstream spaces. As it moves from the margins to the center of conversations, I think we can recognize it as a significant problem.”


Parker next introduced Nat Gyenes, director of the Digital Health Lab at Meedan and research fellow at the Berkman Klein Center for Internet & Society at Harvard University. Gyenes explained that Meedan works directly with social media platforms and Internet search organizations to “strengthen information equity on the Internet.”

Through research conducted at the Digital Health Lab, Gyenes and her team have come to see how important it is to reduce stigma in health misinformation response work. However, she noted, it is a difficult balancing act to reduce the negative impact of health misinformation while ensuring that community members feel comfortable asking questions about health myths.

The world has more Internet users than people with access to essential health services, Gyenes explained, and more than 80 percent of Internet users search for health-related information online. At the same time, she said, health misinformation is becoming an increasingly difficult and complex issue to address, and its consequences “disproportionately affect communities of color, communities with lower socioeconomic status, and queer communities.” Barriers to accessibility, language constraints, and content relevance (or lack thereof) can all exacerbate the negative effects of the proliferation of health misinformation, she noted.

What search engines audiences use and the information they are able to find depend on the Internet availability or digital resources to which they have access. For example, if an individual has access to the Internet through Facebook’s “Free Basics” program,1 which is a collaboration between Facebook and key mobile providers around the world to provide limited Internet access, then that individual can access only select information and not the World Wide Web. Understanding these barriers to Internet access is important, she said, because there are additional barriers, such as language fluency or lower literacy, that can affect whether an Internet user’s search terms match the nuanced language frequently used by public health authorities.

The current COVID-19 crisis has only reinforced the importance of effective collaboration between health authorities and actors in the technology community, Gyenes said. These collaborations can be strengthened by the involvement of FCOs, which are driving forward health communications in the digital information ecosystem.

FCOs have existed since the early 2000s, and many are primarily known for their work investigating political misinformation. Gyenes explained there is one major standardizing body, the International Fact-Checking Network (IFCN), which promotes best practices and a shared code of principles for fact checkers.2 Other organizations may apply to be vetted as verified signatories to the code, she noted. For a few reasons, she said, those FCOs and their verification by IFCN are incredibly important to promoting health literacy. She added,

One reason FCOs are so important is because of their ability to work in direct collaboration with tech companies in responding to misinformation online. FCOs are the local actors who respond directly to the questions and misinformation circulating in their own countries and communities. In doing so, they improve equitable access to health information.

Their methodologies and the publication of those methodologies are both important to improving health literacy and how fact-checking processes take place, said Gyenes. In her opinion, the most important part of an FCO’s role is that it “unpacks claims or pieces of misinformation within the community context in which they are shared.” Members of communities are responding to misinformation that is affecting their communities. Because of that important expertise, she said, those responses to misinformation are “relevant” and accessible to the communities they serve.

One issue, however, is that FCOs are often limited by the information or expertise that is readily available to them. If they cannot get in touch with a public health expert to comment rapidly, misinformation can spread faster than it can be addressed. The important role of health communications and public health literacy experts at this point, Gyenes said, is to “contextualize the latest science in ways that are accessible to fact checkers so that fact checkers can make it accessible to their communities.” She added that filling this need is the goal of Meedan’s public health expert database and tool kit project to respond to COVID-19 misinformation. In early 2020, Meedan built a team of infectious disease experts, health literacy practitioners, epidemiologists, pandemic preventionists, and vaccine uptake researchers to work directly with FCOs and local newsrooms to provide on-demand contextualization for the latest scientific research related to the COVID-19 pandemic. The project team fields questions from fact checkers and can provide responses in more than seven languages, she added.

Technology platforms around the world are already working with FCOs to address misinformation on their platforms. They do this by finding new information pathways to audiences or by using fact-checked information to inform algorithms that determine “which content gets shared with an Internet user either first, or later, or never,” Gyenes said. For example, WhatsApp is currently working to address information accessibility for different communities, she added. In collaboration with Meedan, WhatsApp works with fact checkers directly by enabling the creation of “text bots.” Using these customization tools, FCOs can create their own text bots to provide contextualized information to audiences and, through WhatsApp, receive audience questions about content they want fact checked.

Gyenes noted that questions from those audiences have provided fascinating insights about the discrepancies between the information that individuals need or are curious about and the information published by public health authorities online. “These insights can definitely serve as an opportunity to improve the health communications and health literacy fields,” she said. Gyenes’s team also uses those insights to tailor its responses to health misinformation, ensuring that information is culturally relevant and culturally sensitive, Gyenes said, adding that “we want to ensure that our content is localized and not just translated.”

“Midinformation”3 differs from misinformation and disinformation in that it characterizes a kind of information crisis that occurs when not all of the facts are available: It is informational ambiguity based on scant knowledge or emerging scientific evidence, Gyenes said. To address it, “it’s important to make sure that the information that users see first when they search online is the information you want them to see for a given point in time.” One example of this is Google’s newer annotation tools, which label and highlight fact checks in Google Search and Google News results. Facebook has also partnered with FCOs, and when FCOs identify a piece of content as false and flag it to Facebook, “Facebook can integrate this information into their own content systems to hide or reduce the ability to view a particular piece of misinformation, which significantly reduces its distribution,” Gyenes continued. As FCOs have become so central to tech companies’ responses to health misinformation, Gyenes said, “public health and health literacy experts have an opportunity to collaborate, acting as a resource to fact checkers, supporting their work, and advocating for their work to keep their communities informed.”


Myers introduced the next panelist, Briony Swire-Thompson, a senior research scientist at the Northeastern University Network Science Institute and a fellow at the Harvard University Institute for Quantitative Social Science.

Health information is a unique area of a broader misinformation and disinformation issue, Swire-Thompson said, because there are often financial incentives that do not necessarily exist for other topics of misinformation. Health misinformation can have “particularly severe consequences regarding quality of life and risk of mortality,” she added (Swire-Thompson and Lazer, 2020).

COVID-19 is something of a perfect storm for health misinformation, Swire-Thompson said, “not least because it takes time for science to establish what is true.” Fake experts speak with certainty, she added, because “when you make information up, you don’t have to couch everything in the nuance that often accompanies the truth.” Also, the urgency with which scientists are publishing means that preprints and final publications may differ in their findings, and journalists, for example, may not realize “the difference between new findings and established published product.” Swire-Thompson also noted that predatory journals can pose a problem, because they accept publications for monetary gain and do not have the traditional editorial processes that control for quality and accuracy. She also observed that some search engines like Google Scholar may not always reflect whether literature has been retracted.

Addressing health misinformation is a young field, Swire-Thompson said. There is evidence to suggest that critical thinking is a skill that can be taught, she added, but gauging the efficacy of critical thinking programs can be difficult and findings have been mixed. There is also converging evidence to suggest that older adults (65 years or older) share disinformation online seven times as frequently as 18–29-year-old adults (Grinberg et al., 2019; Guess et al., 2019). Teaching critical thinking at universities may not directly affect the older adult population: “We have to think about where we are implementing health literacy strategies.”

To correct health misinformation, there are several options supported by science, Swire-Thompson explained. One option would be to provide factual alternatives. In the traditional paradigm of correcting misinformation, individuals react well to having one core piece of information replaced by the correct information. In the case of COVID-19, she noted, we often do not yet know the correct alternative to the misinformation. Another element of providing factual alternatives is that they should ideally be as simple as the original misinformation.

Providing warnings if misinformation will appear is also very effective, though it can be difficult to do this if you are not responsible for presenting the information, she noted. Repeating corrections can be effective as well, she said. There is some evidence that what we believe to be true is what we remember to be true, so repeating corrections multiple times should not be a cause for concern (Schacter and Scarry, 2001).

The “backfire effect” occurs when you present an individual with a correction and they strengthen their belief in the misconception you are hoping to rectify, Swire-Thompson said, adding that “this is not a robust empirical phenomenon. There have been widespread failures to replicate, researchers have been unable to elicit it under theoretically favorable conditions, and in some cases, the evidence is not very strong to begin with” (Swire-Thompson et al., 2020).

The backfire effect is often confused with the illusory truth effect, which occurs when people believe incorrect information after repeated exposure to it. However, Swire-Thompson said, as soon as you pair the misinformation component with the correction, belief does not increase. In fact, she added, “if you don’t repeat the original misinformation, people often don’t even know what you are trying to correct. It is very important to clearly and saliently pair the correction with the original misinformation.”

To close her talk, Swire-Thompson observed, “When people read clear corrective evidence, they are incredibly good at updating their beliefs.”


Myers introduced the fourth panelist, Wen-Ying Sylvia Chou, program director of the Health Communication and Informatics Research Branch at the National Cancer Institute (NCI) at the National Institutes of Health (NIH).

Chou explained that the rampant spread of information, especially in the online ecosystem, complicates how health literacy strategies can be developed and deployed. When considering these strategies, it is also important to consider intent, she said. Is it to sow division, to gain profit, to create chaos? A piece of COVID-19-related misinformation that convinces people to drink certain juices more is different than misinformation that convinces people not to wear masks or practice social distancing. Also, she said, the medium, and its control over which information is shared and in what way, is an important component.

A growing body of evidence shows that divisive disinformation campaigns erode consensus, or the sense that there is consensus in the scientific literature, and erode trust in experts, Chou explained. Echo chambers perpetuate these divisions, and falsehoods tend to spread more easily and faster. Credible information is often complex, nuanced, evolving, and uncertain, she continued, and “These are important things in communication; anyone who has done work in health communication can attest to the importance of source, format, and health literacy of the community or the audience we’re communicating with.” In addition, she said, industry and government policies and practices toward misinformation and content moderation are rapidly evolving.

Chou developed a working taxonomy to identify six major COVID-19 misinformation topics and some examples of each (see Box 1).


Planning for the successful uptake of a yet-to-be-developed COVID-19 vaccine should include traditional and newer health literacy approaches, Chou said. Traditional health literacy approaches, she continued, would include proactively promoting vaccine literacy, with

  • interventions including targeted media campaigns;
  • tailored peer-to-peer, school-based, or community-based vaccine education; and
  • provider–patient communication.

One suggestion from the digital literacy literature is to have strong, consistent messaging. “We also need to think about strategies that are already being deployed by anti-vaccine groups,” Chou added. Those strategies, which cannot be addressed with fact checking alone, include

  • Propagating rhetoric related to personal freedom and against government mandates
  • Discrediting agents involved in vaccine development
  • Targeting already mobilized groups and emotional topics

There are some newer strategies that may be effective, Chou said. While there is not a lot of established literature on their efficacy, she said, “I think these are worthy targets.”

Some novel communication strategies may include

  • Inducing skepticism toward disinformation agents (similar to the discrediting of tobacco marketing)
  • Developing tools to help identify and access credible information sources and resources for debunking myths and misinformation
  • Cultivating science literacy: understanding the uncertain and evolving nature of science
  • Combating conspiracy theories by partnering with former members and trusted influencers
  • Mobilizing the public health majority to counter online misinformation
  • Proactively monitoring, flagging, downranking, and removing content or accounts that promote misinformation; reconfiguring platform features that amplify misinformation (e.g., Twitter’s handling of QAnon and Facebook’s and Google’s removal of misinformation videos)

These efforts can help address cognitive, emotional, social, and contextual factors of misinformation spread, she said.

Chou described a study in development at NCI—a randomized trial to look at the use of storytelling and narrative-based messages to promote recommended COVID-19-related behaviors. The study would explore attitudes, beliefs, and behaviors at baseline and provide people with congruent messages—one in a personal experience narrative format and one in a non-narrative didactic format—to see which one is more effective at changing attitudes and behaviors. Similar research endeavors, Chou added, could help the health literacy and health communication fields better understand how to address health misinformation.

“The priority is to put health literacy in context,” she said. “It does not exist in a vacuum, and it’s not just about providing good information or filling in the gap where there is a lack of good information. We need to consider the role of technology, identity, values, biases, and emotions, and learn from examples of successful or effective communication.”

Chou noted that there are three areas in which health literacy approaches or interventions could be conducted in new ways (Chou et al., 2020; Peterson et al., 2020). The first, she said, is digital literacy. “It’s not just a matter of discerning a piece of health information. It’s about fostering fact-checking skills and an awareness of algorithms or techniques used to make you want to click on something or share a meme that gets you really excited or angry.” The second one is redefining what is meant by vulnerability. Traditionally, many of us think about limited health literacy in terms of limited English proficiency or limited education, she said, but vulnerability has taken on a new meaning: It can include those who operate in online information silos or who have conspiratorial mindsets. Health literacy interventions need to penetrate those silos, she said. Last, Chou added, any health literacy efforts need to consider the role of trust: How can we foster trust and restore trust as part of any health literacy initiative?


Myers observed that each speaker was optimistic about the health literacy community’s ability to address health misinformation, having each identified a variety of promising tactics and resources. One theme in questions from the audience members was the important role of so many players in the health system, she noted, including the roles of health information technology professionals, the technology industry, fact checkers, journalists, and clinicians. All are important in addressing health misinformation. She invited the panelists to reflect on the discussion.

Starbird agreed that health misinformation needs to be addressed from multiple perspectives, including health literacy, education, and social media platforms. Social media platforms have been making changes, but they need to continue to do so, she said, adding that there is also a role for government and policy, though it will be complicated. She noted that she appreciated Swire-Thompson’s research suggesting that the backfire effect should not be a consideration when addressing misinformation. “For years, we had been telling people not to correct other people online, telling journalists not to talk about it because it would amplify [the misinformation]. But we gave the wrong advice—it began to fester at the edges and move into the conversation.” But, she added, we started to develop and spread new norms around corrections with empathy.

Another element of disinformation campaigns that is difficult to address, Starbird said, is their intersection with authentic activism, in which politically motivated groups are targeted to become vectors of misinformation and disinformation around health and COVID-19. It is hard to address, she said, but it is important to find a way to help activist communities protect themselves from infiltration by people who have other motives. It is about education, but it is also about platform design and platform policies, she added. “I think we need to work together holistically across the different sectors to address the problems.”

Gyenes added that the tech community has learned a lot from the public health community about communication, intervention design, controlling for factors and populations, and understanding that populations have nuanced needs when it comes to outreach. “It’s our hope,” she said, “that in coming to this discussion from the public health, technology, and psychology sectors, we can work to create more interdisciplinary solutions.”

Swire-Thompson echoed Starbird’s and Gyenes’s observations. The science is impacting the policy and vice versa, she said. Researchers have a responsibility to develop replicable, evidence-based recommendations, and social media platforms have a responsibility to make changes that make sense for a factual, evidence-based world.

Chou added: “We have seen more than 150,000 deaths already. From the perspective of a public health practitioner, and someone who cares about communication and health literacy,” she said, “we need to take novel approaches and we need to try something different. The traditional health literacy approaches have worked for certain things, but they are not working in the information ecosystem.” We can’t remain naïve, she said. “Health misinformation is obviously not a fringe topic, and we need to work together.” The people who study misinformation and disinformation need to be at the table when we are designing public health campaigns and messaging so we avoid the problem of inaccessible information, she said.

Parker thanked the panelists, noting that she was inspired by their thinking, research, work, and collaborative spirit. “I’m hearing some new horizons for health literacy,” she added. “Fact checking is probably a new horizon for health literacy: the role of having people who understand health and public health, and the various entities that are a part of it.” She noted her appreciation for Chou’s comments that “health literacy does not live in a vacuum,” as well as the emphasis on the importance of trust. Concluding the workshop, Parker observed that building trust is “truly foundational to our individual and collective lives.” ◆◆◆




For more information, see https://connectivity.fb.com/free-basics (accessed September 16, 2020).


For more information, see https://www.poynter.org/ifcn (accessed September 16, 2020).


This Proceedings of a Workshop—in Brief was prepared by Alexis Wojtowicz as a factual summary of what occurred at the workshop. The statements made are those of the rapporteur or individual workshop participants and do not necessarily represent the views of all workshop participants; the planning committee; or the National Academies of Sciences, Engineering, and Medicine.

*The National Academies of Sciences, Engineering, and Medicine’s planning committees are solely responsible for organizing the workshop, identifying topics, and choosing speakers. The responsibility for the published Proceedings of a Workshop—in Brief rests with the institution.

The members of the planning committee were Laura Bartlett, National Library of Medicine; Jennifer Dillaha, Arkansas Department of Health; Ellen Markman, Stanford University; Michael M. McKee, University of Michigan School of Medicine; Laurie Myers, Merck Sharp & Dohme Corp.; and Ruth Parker, Emory University School of Medicine.


To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop—in Brief was reviewed by Christopher R. Trudeau, University of Arkansas at Little Rock William H. Bowen School of Law, and University of Arkansas for Medical Sciences Translational Research Institute, and Amanda J. Wilson, National Library of Medicine, National Institutes of Health. Lauren Shern, National Academies of Sciences, Engineering, and Medicine, served as the review coordinator.


Rose Marie Martinez and Alexis Wojtowicz, Board on Population Health and Public Health Practice, Health and Medicine Division, National Academies of Sciences, Engineering, and Medicine

For additional information regarding the workshop, visit www.nationalacademies.org/HealthLiteracyRT

Health and Medicine Division


The nation turns to the National Academies of Sciences, Engineering, and Medicine for independent, objective advice on issues that affect people's lives worldwide.


SPONSORS: This workshop was partially supported by AbbVie Inc.; California Dental Association; Eli Lilly and Co.; Health Literacy Media; Health Literacy Partners; Health Resources and Services Administration; Merck Sharp & Dohme Corp.; National Library of Medicine; Northwell Health; and Pfizer Inc.

Suggested citation:

National Academies of Sciences, Engineering, and Medicine. 2020. Addressing health misinformation with health literacy strategies: Proceedings of a workshop—in brief. Washington, DC: The National Academies Press. https://doi.org/10.17226/26021.

Copyright 2020 by the National Academy of Sciences. All rights reserved.
Bookshelf ID: NBK565935; PMID: 33351400; DOI: 10.17226/26021


  • PubReader
  • Print View
  • Cite this Page
  • PDF version of this title (124K)

Related information

Similar articles in PubMed

See reviews...See all...

Recent Activity

Your browsing activity is empty.

Activity recording is turned off.

Turn recording back on

See more...