Case Study
Facebook’s “race blind” content moderation drives away Black users
Facebook came under public fire and was hauled before Congress in 2021 over its content moderation practices, which the company’s own researchers had found allowed hate speech to proliferate while disproportionately censoring Black users and people from other marginalized groups; many Black users left the platform as a result. These findings were consistent with criticisms that civil rights groups had directed at Facebook for years. In 2020, for example, the ACLU of Northern California urged Facebook to address content moderation on its platform following the erroneous takedowns of content shared by Black activists and LGBTQ artists. Facebook has since promised to examine and improve how its platform treats people differently based on race, and has created a Civil Rights Team to audit the company’s progress.