Caught in the Filter: How Social Media Moderation Silences Marginalized Voices
In Social Policy
By Cathy Jiang
Social media platforms claim to protect users from harm, but for many marginalized communities, those same systems are doing the opposite. Researchers and advocacy groups have shown that speech by women, people of color, LGBTQ+ people, and religious minorities is disproportionately censored or removed, while harassment against these same groups is often ignored (Diaz and Hecht-Felella). Content moderation systems on major social media platforms disproportionately silence marginalized voices because of structural flaws in how those systems are designed and enforced. This report examines the historical context, the mechanisms of moderation, case studies, and reform proposals related to these disparate outcomes.