Inaccurate Content: The Impact of Media Misinformation on Public Health Accessibility

This policy brief evaluates potential solutions for balancing harmful misinformation and maintaining access to public health-related resources through the media.

Published March 22, 2025

This inquiry-driven project may reflect personal views and aims to enrich discourse on the problem.


Executive Summary


The rise of social media and its growing influence have become pressing topics in the United States and, on a larger scale, the world. Users increasingly turn to social media and other tools, such as artificial intelligence (AI), as search engines. This brief examines how limitations on these platforms can prevent information, specifically about public health issues, from reaching different audiences.

Overview


Recently, digital tools beyond Google have come into use as search outlets for individuals seeking information. This new accessibility has raised prevalent concerns about misinformation, which can expose viewers to skewed, biased, or incomplete results. In a public health context specifically, recent restrictions within the United States on platforms such as TikTok threaten access to health information. This creates a tension between reducing harmful misinformation and maintaining access to beneficial health resources that communities may need. As a result, the issue sparks political debate and disagreement.

Relevance

Many adults in the United States have begun using social media to seek health-related information. A study by Jim P. Stimpson and Alexander N. Ortega found that 67% of social media users could not tell whether the health information they saw was accurate. Xun Wang and Robin A. Cohen, researchers with the Centers for Disease Control and Prevention (CDC), reported in 2023 that about 55% of adults seek information on health issues through the Internet.

As internet use for information grows, many users struggle to determine the accuracy of what they find. These concerns also relate to "shadow banning," in which a platform quietly restricts the visibility of a user's content, or withholds information from users, without notifying anyone. Furthermore, the COVID-19 pandemic is a vital example of online misinformation in the media. Notably, misinformation about vaccines surged, including claims linking vaccination to autism, despite scientific evidence disproving the link.

History

A. Current Stances


Since the early 2000s, social media has been a global tool for sharing personal updates and news. YouTube and Facebook are key examples of how these platforms have influenced health-related campaigns. Newer platforms like Instagram and TikTok now coexist with these earlier ones to deliver crucial news to individuals.
Nevertheless, unmoderated social media can harm users. The COVID-19 pandemic brought attention to this, as groups such as the World Health Organization (WHO) highlighted the dangers of platforms failing to filter out and respond to false information about the virus. As a result, these companies must balance public support with avoiding government control. The pandemic also highlighted how individuals could be harmed by believing information received online without verifying its source.
Currently, many platforms operate under specific policies. Meta (formerly Facebook) prohibits content involving physical harm or violence, harmful health misinformation, voter or census interference, and even manipulated media (such as content altered with artificial intelligence). Similarly, TikTok uses a global fact-checking database to verify the accuracy of content, aiming to protect users while maintaining an international perspective.

Policy Problem

A. Stakeholders


Many key stakeholders are affected by misinformation about public health and by restrictions on social media platforms.
The most apparent stakeholder is the general public, specifically social media users. Those who use these platforms to seek health-related information are especially vulnerable to misinformation, or to a lack of information, given that, according to the CDC, 55% of adults in the U.S. seek health information online.


Other stakeholders include healthcare workers, experts, professionals, and organizations. Organizations such as the CDC, the WHO, and health departments leverage the reach of social media to spread health-related information for the public's benefit. This is especially important during emergencies such as COVID-19 and, more recently, during outbreaks with pandemic potential, including Ebola in Uganda and mpox in East Africa, both subject to CDC Level 2 travel notices.
Social media platforms themselves play a significant role in determining what is at risk. Meta, TikTok, X, Instagram, and others have expressed a shared interest in moderating information in ways that balance freedom of expression with preventing harm.
Government agencies, including the Federal Trade Commission (FTC) and the Department of Health and Human Services (HHS), are also stakeholders. Working with Congress, the FTC acts to prevent unfair business practices; because social media involves transactional and business relationships, the FTC can hold platforms accountable for endorsing disinformation. Lastly, the HHS provides oversight for public health programs within the U.S. It works to counter misinformation by distributing accurate information and collaborating with social media platforms to promote reliable content. While the department cannot regulate information directly, its involvement with the platforms helps limit the spread of misinformation.

B. Risks of Indifference


Ignoring misinformation's impact on public health carries significant risks. It can increase the risk of infection, especially during a public health crisis, as seen during COVID-19. Delays caused by media-influenced beliefs increase disease spread and preventable deaths. When accurate information does not reach people, the health disparity gap widens. Public trust is also beginning to erode: one survey found that 34% of social media users exposed to misinformation believed that the COVID-19 vaccine contributed to deaths. A 2021 study on COVID-19 estimated that misinformation-related health costs ranged from $50 million to $300 million daily due to hospitalizations, delayed treatments, and vaccine hesitancy.

C. Nonpartisan Reasoning


Because the overall health of a society affects entire communities rather than a single subgroup or population, nonpartisan intervention is essential.

  1. Economic impact: The costs misinformation imposed on health systems during COVID-19 were detrimental. Mortality and morbidity accounted for roughly $60 million of an estimated $1 billion lost per day by the healthcare system and by individuals missing work. It is especially concerning that misinformation alone caused an estimated $50 million to $300 million per day in costs deemed preventable, driven by access to inaccurate information.
  2. National security: In 2022, the U.S. Department of Homeland Security identified misinformation as a threat to the security of the United States, noting that its impact extends well beyond the individual.
  3. Community impact: Deaths, preventable or not, always affect a community, especially those driven by vaccine misinformation during the pandemic; hence the need for nonpartisan interventions. Building on the economic point above, losing many individuals within a population imposes financial losses on society (workforce, consumers, etc.). This matters all the more because communities with higher health literacy tend to have better outcomes in health crises.

Tried Policy


Different approaches have been used to address misinformation about public health. The U.S. Surgeon General's advisory of July 2021 was the first federal public acknowledgment of the threat misinformation poses to the integrity of public health, specifically citing the rejection of COVID-19 vaccines by people unable to fully judge what to believe.
The Digital Literacy Act of 2022, also introduced as the Digital Citizenship and Media Literacy Act, proposed roughly $20 million in grants to encourage media literacy instruction in K-12 schools, but it was never enacted into law. The bill gained initial bipartisan support, but concerns about federal overreach and implementation stalled its progress.
Meta is an example of a social media platform that implemented specific procedures for removing misleading information. However, in January 2025, Meta announced that it would end its fact-checking program, citing protection of free speech in the United States. These efforts highlight the need for stronger measures to ensure accurate and accessible health information.

Policy Options


Attempting to Reintroduce the Digital Literacy Act
Although the bill did not pass when first introduced, media has since changed and faces new challenges, such as AI. Given that the bill previously received bipartisan support, retaining its original language while adding the technological threats seen in 2025 would improve its likelihood of passage. Additionally, rather than outright limiting inaccurate information online, which can be characterized as restricting free speech, the bill would focus on teaching K-12 students to understand social media, and specifically the health information they encounter there. This could occur through classroom education and through pediatricians administering a short screening similar to the mental health questions already asked of minors. Implementation challenges are substantial: creating a standardized, politically bipartisan curriculum; training teachers across school districts with varying resources; and measuring the programs' effectiveness. Educational authorities would also need extensive time to implement and enforce these requirements in every publicly funded school across the U.S.

Utilizing the EU Digital Services Act as a Model for U.S. Public Health Information
This policy would adopt key components of the European Union's Digital Services Act (DSA), particularly its requirements for greater online transparency and the removal of illegal information, including disinformation. Implementing this would be challenging, however, because U.S. First Amendment speech protections are stronger than the EU's. Clear definitions distinguishing misinformation from protected speech would also be needed.

Platform Accountability Measures
Because current backlash against social media platforms carries no enforceable measures against misinformation, stronger consequences should be used to encourage compliance. These could take the form of warning systems that escalate into financial penalties, giving platforms (and specifically their owners) an incentive to comply, or official oversight by government agencies such as the Office of Science and Technology Policy (OSTP) or the U.S. State Department's Bureau of Diplomatic Technology, using technological safeguards to confirm that flagged information genuinely threatens the well-being of society. Nevertheless, this approach poses challenges, including enforcement strategies that do not block legitimate content and consistent enforcement across political administrations.

Conclusions


Broader implications must be considered as we move into the future. The new administration must weigh the threat of emerging diseases, such as bird flu, for which readily available, accurate information is crucial. I recommend reintroducing the Digital Literacy Act to address media-based misinformation. This bipartisan solution focuses on children, the individuals who will shape the future, and avoids dismantling free speech, a value central to the United States. Reducing the proposed funding from $20 million to $15 million could improve the policy's feasibility, and participation could be made voluntary for states. Ultimately, ensuring accurate and accessible information is essential for the future of public health. Without action, the consequences will continue to erode trust and delay the urgency of responding to critical health issues.

References

[1] Stimpson, J. P., & Ortega, A. N. (2023). Social media users' perceptions about health mis- and disinformation on social media. Health Affairs Scholar, 1(4). https://doi.org/10.1093/haschl/qxad050
[2] Wang, X., & Cohen, R. A. (2023). Health information technology use among adults: United States, July–December 2022. (NCHS Data Brief, No 482). National Center for Health Statistics. https://dx.doi.org/10.15620/cdc:133700.
[3] Scannell, D., Desens, L., Guadagno, M., Tra, Y., Acker, E., Sheridan, K., & Fulk, M. (2021). COVID-19 Vaccine Discourse on Twitter: A Content Analysis of Persuasion Techniques, Sentiment and Mis/Disinformation. Journal of Health Communication, 26(7), 443–459. https://doi.org/10.1080/10810730.2021.1955050
[4] Baker, S. A., Wade, M., & Walsh, M. J. (2020). The challenges of responding to misinformation during a pandemic: Content moderation and the limitations of the concept of harm. Media International Australia, 177(1), 103–107. https://doi.org/10.1177/1329878X20951301
[5] Meta. (n.d.). Misinformation. https://transparency.meta.com/policies/community-standards/misinformation/
[6] Centers for Disease Control and Prevention. (n.d.). Clade I mpox in central and eastern Africa - level 2 - practice enhanced precautions - travel health notices. Centers for Disease Control and Prevention. https://wwwnc.cdc.gov/travel/notices/level2/clade-1-mpox-central-eastern-africa
[7] Centers for Disease Control and Prevention. (n.d.-b). Ebola in Uganda - level 2 - practice enhanced precautions - travel health notices. Centers for Disease Control and Prevention. https://wwwnc.cdc.gov/travel/notices/level2/ebola-uganda
[8] Moorhead, S. A., Hazlett, D. E., Harrison, L., Carroll, J. K., Irwin, A., & Hoving, C. (2013). A New Dimension of Health Care: Systematic Review of the Uses, Benefits, and Limitations of Social Media for Health Communication. Journal of Medical Internet Research, 15(4). https://doi.org/10.2196/jmir.1933
[9] Federal Communications Commission. What We Do. https://www.fcc.gov/about-fcc/what-we-do
[10] Johnson, T. M. (n.d.). The FCC’s authority to interpret Section 230 of the Communications Act. Federal Communications Commission. https://www.fcc.gov/news-events/blog/2020/10/21/fccs-authority-interpret-section-230-communications-act
[11] Federal Trade Commission. What the FTC does. https://www.ftc.gov/news-events/media-resources/what-ftc-does
[12] U.S. Department of Health and Human Services. (2021). Confronting health misinformation: The U.S. Surgeon General's advisory on building a healthy information environment. https://www.hhs.gov/surgeongeneral/reports-and-publications/health-misinformation/index.html
[13] Suarez-Lledo, V., & Alvarez-Galvez, J. (2021). Prevalence of Health Misinformation on Social Media: Systematic Review. Journal of Medical Internet Research, 23(1). https://doi.org/10.2196/17187
[14] National Academies of Sciences, Engineering, and Medicine. (2020). Addressing Health Misinformation with Health Literacy Strategies: Proceedings of a Workshop—in Brief. The National Academies Press. https://doi.org/10.17226/26021
[15] Broniatowski, D. A., Jamison, A. M., Qi, S., AlKulaib, L., Chen, T., Benton, A., Quinn, S. C., & Dredze, M. (2018). Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate. American Journal of Public Health, 108. https://doi.org/10.2105/AJPH.2018.304567
[16] Lopes, L., Kearney, A., Washington, I., Valdes, I., Yilma, H., & Hamel, L. (2023, August 22). KFF Health Misinformation Tracking Poll Pilot. KFF. https://www.kff.org/health-information-and-trust/poll-finding/kff-health-misinformation-tracking-poll-pilot/
[17] Burns, R., Hosangadi, D., Trotochaud, M., & Sell, T. K. (n.d.). Misinformation and Public Health: The Impact of False Information on Response Efforts. Johns Hopkins Center for Health Security. https://centerforhealthsecurity.org/sites/default/files/2023-02/210322-misinformation.pdf
[18] Homeland Security Advisory Council. (n.d.). Final Report on Disinformation. U.S. Department of Homeland Security. https://www.dhs.gov/sites/default/files/2022-08/22_0824_ope_hsac-disinformation-subcommittee-final-report-08242022.pdf
[19] Ratzan, S. C. (2011). Health Literacy: Building Upon a Strong Foundation. Journal of Health Communication. https://doi.org/10.1080/10810730.2011.606071
[20] U.S. Department of Health and Human Services. (2021). Confronting Health Misinformation: The U.S. Surgeon General's Advisory on Building a Healthy Information Environment. https://www.hhs.gov/surgeongeneral/reports-and-publications/health-misinformation/index.html
[21] Congress.gov. (2024, May 22). Investing in Digital Skills Act. https://www.congress.gov/bill/118th-congress/senate-bill/4391/text
[22] Gibson, K. (2025, January 8). Mark Zuckerberg says ending fact-checks will curb censorship. Fact-checkers say he’s wrong. CBS News. https://www.cbsnews.com/news/meta-mark-zuckerberg-facebook-fact-checkers-censorship/
[23] European Commission. (n.d.). The Digital Services Act Package: Shaping Europe’s Digital Future. https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package
[24] Office of Science and Technology Policy. (n.d.). Office of Science and Technology Policy (OSTP): Usagov. https://www.usa.gov/agencies/office-of-science-and-technology-policy
[25] U.S. Department of State. (n.d.). Bureau of Diplomatic Technology. https://www.state.gov/bureaus-offices/under-secretary-for-management/bureau-of-diplomatic-technology
[26] Centers for Disease Control and Prevention. (n.d.). Bird Flu. https://www.cdc.gov/bird-flu/index.html

Christiane Morton

2025 Winter Fellow

Christiane Morton is a student at George Washington University in Washington, D.C., studying International Affairs with a Concentration in Global Public Health.
