A sense of community can flourish when users are able to communicate freely. Narrowly tailor your terms of service or community guidelines to prohibit only illegal content or specific harmful behavior that undermines the purpose of your service. Doing so will limit both the time you spend monitoring speech and the risk of applying your policies inconsistently or with bias.
Facebook came under public fire, many Black users left the platform, and the company was hauled before Congress in 2021 over its content moderation practices, which the company's own researchers found allowed hate speech to proliferate while disproportionately censoring Black users and people from other marginalized groups. Those findings were consistent with criticisms that civil rights groups had directed at Facebook for years, including the ACLU of Northern California, which in 2020 urged Facebook to address content moderation on its platform following the erroneous takedowns of content shared by Black activists and LGBTQ artists. Facebook has promised to examine and improve how its platform treats people differently based on race and has created a Civil Rights Team to audit the company's progress.