MODERATE CAUTIOUSLY

Minimize your control over user expression.

If your product provides a forum for content or communication, consider carefully the far-reaching impact of any content moderation policies or decisions, and whether you want to be in the business of policing those forums at all. Following the Santa Clara Principles, any policies addressing illegal or otherwise harmful behavior should be as clear and narrow as possible, transparent to users, and applied in an equitable manner. Any policies and mechanisms must also be regularly evaluated for their civil liberties and civil rights impact on users, particularly users whose voices have historically been marginalized.

BUILD A REPORTING SYSTEM THAT SAFEGUARDS FREE EXPRESSION
Allowing users to report policy violations can help you enforce those policies, but it can also invite complaints aimed at silencing innocent users. To ensure you receive accurate information, and to help you focus your resources on meaningful user concerns, encourage users who file complaints to provide specific details about their concerns.
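One way to put this into practice is to require a specific policy citation and a substantive explanation before a complaint enters your review queue. The sketch below is only illustrative; the Report shape, field names, and minimum-detail threshold are assumptions, not a prescribed design.

```typescript
// Illustrative sketch of report intake; the Report shape, field
// names, and 50-character threshold are assumptions, not a real API.
interface Report {
  reporterId: string;
  targetContentId: string;
  policyViolated: string; // the specific policy the reporter believes applies
  details: string;        // free-text explanation of the concern
}

// Reject vague complaints up front so reviewers spend their time on
// substantiated concerns rather than attempts to silence other users.
function validateReport(report: Report): { ok: boolean; reason?: string } {
  if (!report.policyViolated) {
    return { ok: false, reason: "Identify the specific policy you believe was violated." };
  }
  if (report.details.trim().length < 50) {
    return { ok: false, reason: "Describe your concern in more detail." };
  }
  return { ok: true };
}
```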
Case Study

Facebook’s “Fake Name” Reporting Option Enrages Users

Facebook was heavily criticized for maintaining a “fake name” reporting option even as it took steps to move away from its “real name policy.” The system was abused to target legitimate users, particularly members of the LGBT community. Targeted users and their communities were “furious” at being trapped in Facebook “purgatory” by a system that enabled rather than prevented abuse.

INFORM USERS ABOUT POTENTIAL VIOLATIONS OF YOUR POLICIES
You can help your users understand your policies, and get timely feedback that helps you avoid embarrassing mistakes, by notifying users when you believe they might have violated your policies. Explain which policies are implicated and any actions you have taken or may take. Communication and transparency help users better understand your rules and explain their own actions.
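Such a notice can be as simple as a structured message that names the policy, links to its full text, and describes the action taken. The sketch below is a hypothetical example; the ViolationNotice fields and wording are assumptions.

```typescript
// Hypothetical shape for a violation notice; fields and wording are
// illustrative assumptions, not a prescribed format.
interface ViolationNotice {
  contentId: string;
  policy: string;      // name of the policy implicated
  policyUrl: string;   // link to the full policy text
  actionTaken: string; // e.g. "post hidden pending review"
  responseUrl: string; // where the user can respond or appeal
}

// Tell the user what happened, why, and how to respond.
function formatNotice(n: ViolationNotice): string {
  return [
    `Your post (${n.contentId}) may violate our policy on ${n.policy}.`,
    `Read the full policy: ${n.policyUrl}`,
    `Action taken: ${n.actionTaken}`,
    `If you believe this is a mistake, respond here: ${n.responseUrl}`,
  ].join("\n");
}
```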
Case Study

Reddit’s “Shadowbanning” Criticized for Leaving Users in the Dark

After being targeted by criticism from its own users, reddit was forced to backtrack from its practice of “shadowbanning” users who violated its rules. Users who were shadowbanned received no notice and continued to experience reddit in the same manner as before, but none of their actions were visible to other reddit users, resulting in an effective ban from the site. The practice was widely criticized for alienating users, leading reddit to pledge to replace the “confusing” system with a more transparent approach.

KEEP CONTENT AND USER ACCOUNTS ACTIVE
Not every user complaint or automated flag you encounter will actually reflect a violation of your policies. The best way to avoid angering users, and having to publicly apologize for improperly imposed penalties, is to determine whether one of your rules was actually violated before taking any action.
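In code, this principle amounts to a simple gate: enforcement actions fire only after a review confirms a violation, never on an unreviewed flag. The sketch below assumes a hypothetical review workflow; the type and field names are illustrative.

```typescript
// Hypothetical review workflow; the states and fields are
// illustrative assumptions.
type ReviewOutcome = "violation_confirmed" | "no_violation" | "pending";

interface Flag {
  contentId: string;
  source: "user_report" | "automated";
  outcome: ReviewOutcome;
}

// Content stays up while a flag is pending; action follows only a
// confirmed determination that a rule was actually broken.
function shouldTakeDown(flag: Flag): boolean {
  return flag.outcome === "violation_confirmed";
}
```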
Case Study

Facebook Criticized for Censoring ACLU Blog Post About Censorship

Facebook found itself in the middle of another controversy around online speech when it deleted a blog post that, ironically enough, was about free speech and censorship. The post, which discussed a “kerfluffle” about a partially nude statue in a public park in Kansas, included a picture of the statue. When the photo was incorrectly identified as a nude picture in violation of Facebook’s Community Standards, the service not only removed the blog post but also “unpublished” the ACLU’s entire Facebook page. The company ultimately backtracked, claiming that both takedowns were mistakes, but not before the media picked up on the controversy and highlighted Facebook’s “history of problems with boobs” and other incidents where it incorrectly or arbitrarily enforced its policies.

GIVE USERS THE RIGHT TO BE HEARD
Give your users a chance to be heard by allowing them to make their case, ideally before you make any decision to remove their content or otherwise affect their standing on your service. You might do this by creating an informal user-to-user dispute resolution channel. At a minimum, ensure that any dispute resolution process grants users the right to appeal decisions. With a robust dispute resolution and appeals process, you can obtain the information needed to quickly resolve disputes and reduce the risk of incorrect and unfair decisions, all while demonstrating respect for your users’ views and experience.
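One way to model such a process is as an appeal record attached to each enforcement decision, with explicit states so users always know where their dispute stands. The sketch below is hypothetical; the states, fields, and restore rule are assumptions, not a prescribed workflow.

```typescript
// Hypothetical appeal record; states and fields are illustrative
// assumptions.
type AppealState =
  | "open"          // the user has contested the decision
  | "awaiting_user" // a reviewer asked the user for more context
  | "upheld"        // the original decision was confirmed on review
  | "reversed";     // the decision was overturned

interface Appeal {
  decisionId: string;
  userStatement: string; // the user's own account of what happened
  state: AppealState;
  reviewerNotes?: string;
}

// Restore the content or account once an appeal resolves in the
// user's favor.
function shouldRestore(appeal: Appeal): boolean {
  return appeal.state === "reversed";
}
```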
Case Study

Medium Tries to Craft “Human and Practical” Policies for Its Platform

In mid-2015, the blogging site Medium introduced a straightforward set of content policies for posts and comments. The policies are designed to accommodate controversial speech, and they are well organized and written in plain English. Notably, Medium states that users who encounter content that may violate a rule will speak with “a real human being” at the company in order to resolve any disputes, a move designed to facilitate fair outcomes and prevent inappropriate removal of speech from the site. The press praised Medium’s changes as “setting the tone for the community it wants to foster on its site.”

"We believe that counter-speech, is one of the best ways to deal with hate speech. That's the balance that we're trying to strike." - Ross LaJeunesse, Google Global Head of Free Expression & International Relations
