We often hear in the news about particular groups of people who face more censorship on social media than others: for instance, transgender users having their videos removed on TikTok and Black Facebook users barred from discussing their experiences with racism. Content on social media can be moderated by algorithms or by real people; either way, deciding what is “right” or “wrong” involves a grey area. By highlighting which groups are more likely to have their content censored on social media sites, and why, in this work we take the first steps toward making social media content moderation more equitable.
We want to hear your stories if you believe any of your social media posts or accounts have been incorrectly taken down. As researchers from the University of Michigan School of Information, we want to uncover the pain points marginalized people face on social media and share them with social media policy managers to help make the internet an equitable and inclusive space.
Disclaimer: OIHC may sometimes reach out to individuals to assist them with content takedowns on social media, but this assistance is not guaranteed. OIHC cannot assure that any content takedown or account ban appeal made to a social media platform will be approved. We are not affiliated with any social media platform, including but not limited to Facebook, Instagram, Twitter, Reddit, Tumblr, and YouTube.
OIHC is funded in part by the National Science Foundation and the University of Michigan School of Information. Resources on the website were written through a collaboration among our research team at the University of Michigan School of Information, the Harvard Law School Cyberlaw Clinic, and Salty.
Team members and collaborators include: Sammy Camy, Daniel Delmonaco, Claire Fitzsimmons, Jessica Fjeld, Oliver Haimson, Landon Harris, Shannon Li, Samuel Mayworm, Christian Paneda, Hibby Thach, and Andrea Wegner.
All illustrations on the OIHC website are credited to Irene Falgueras via blush.design