Social media and search platforms are in widespread use, and while many people use them to promote evidence-based information, they also allow misinformation to spread quickly – especially during the COVID-19 pandemic. In a panel at the 2021 Annual Research Meeting (ARM), moderator Dr. David Lazer of Northeastern University and panelists Dr. Olveen Carrasquillo, Dr. Aaron Carroll, Dr. Wen-Ying Sylvia Chou, and Dr. Garth Graham discussed strategies to curb misinformation.

Misinformation or “Misinfodemics” During COVID-19

Misinformation has become a central theme for the field of health services research (HSR) because it has been so consequential during the pandemic. Social media has created a “misinfodemic” online, meaning the ubiquitous health information available online has created information silos and echo chambers that enable users to gravitate toward like-minded ideas. This is particularly challenging because, as Chou said, “Falsehoods spread farther than truths and garner more engagement, whereas credible science information is more complex, evolving, and uncertain.” The field of HSR has to contend with a rise of distrust in science, experts, and institutions.

The COVID-19 pandemic has highlighted the rise of misinformation on social media, not just in terms of vaccine hesitancy, but also the initial denial of the pandemic, which led to conspiracy theories, opposition to masks and other public health policies, misinformation about scientists and health professionals, unproven treatments (such as bleach), and misinformation about the safety of the vaccine.

Social media platforms themselves have a responsibility to catch incorrect content. However, Graham emphasized that simply removing inaccurate posts doesn’t work; they must be replaced with correct information.

Accurate Messages from Trusted Voices

Combatting misinformation is difficult, and changing people’s minds is hard, Aaron Carroll noted. Panelists agreed that addressing this issue takes real time and effort.

“It’s not a tweet that’s changed someone’s mind. It’s a trusted source that tweets and people listen to the trusted source, whether it’s correct or not,” Carroll said.

Catchy sayings or brief messages meant to address misinformation often have the opposite effect, further entrenching people in their misbeliefs. But Carroll noted that social media can be used to drive people to good content, such as evidence-based articles and individual experts, rather than media outlets.

Panelists agreed that we need a robust system in place to combat misinformation. From health care providers and organizations to journalists, schools, and community organizations, all need to maintain consistent messaging across different channels, build transparency and trust, and prioritize health equity. Chou highlighted other ways to combat misinformation:

  • improving surveillance of platforms,
  • focusing on consequences,
  • being aware of emotional components and cognitive bias, and
  • creating innovative interventions.

These efforts need to move beyond fact-checking and include a multi-level and systemic approach to address misinformation.

Carroll offered another method for combatting misinformation: members of the field can add their own voice to the conversation by becoming a reliable and trusted source for the media. Journalists need good research and science and want to get the correct information out there. For more information on working with the media, see this recent blog post.

Using Community Coalitions and Partnerships to Build Trust

Olveen Carrasquillo spoke about his experience with community outreach efforts to combat vaccine hesitancy during COVID-19. He worked with partners and community leaders, including universities and health organizations, to form a statewide coalition to increase vaccine acceptance. The media and social media had caused distrust, and the people in his community responded better to information from trusted leaders.

He noted that those who perpetuate false health information often use medical terms to sound convincing, and considerable factual information is mixed in with the inaccuracies that spread on social media. He discussed how in the wealthy area of Fisher Island, Florida, residents got vaccinated but minority employees did not. To address this disparity, health care workers provided vaccine shots on site and experts talked with the employees in small groups; over 50 percent received the vaccine that day. This experience highlights how people need access to local, trusted leaders outside of the media.

Public health also needs to be engaged with technological solutions. Graham, drawing on his work at YouTube and Google Health, noted that COVID-19 underscored the important role YouTube can play in communicating public health information to the world. He said there have been more than 100 million views of videos related to COVID-19 vaccines. However, more than 850,000 videos had to be removed for COVID-19 misinformation. To combat this, YouTube worked with the Kaiser Family Foundation to create content from Black clinicians for Black communities, in addition to tailored content for Latinx and rural communities. These partnerships resulted in content that was responsive to the information needs of target audiences. Graham concluded his presentation by noting that “misinformation spreads when people have questions, and science needs to be able to show up with answers.”


Christina Tudor

Digital Communications Manager - AcademyHealth

