MODERATE CAUTIOUSLY

Minimize your control over user expression.

If your product provides a forum for content or communication, consider carefully the far-reaching impact of any content moderation policies or decisions, and whether you want to be in the business of policing those forums at all. Following the Santa Clara Principles, any policies addressing illegal or otherwise harmful behavior should be as clear and narrow as possible, transparent to users, and applied equitably. Policies and enforcement mechanisms should be regularly evaluated for their civil liberties and civil rights impact on users, particularly users whose voices have historically been marginalized.

FOUNDATIONAL PRINCIPLES

1. Human Rights and Due Process. Ensure that human rights and due process considerations are integrated at all stages of content moderation, and give users clear and accessible ways to obtain support when action is taken against their content or account.

2. Understandable Rules and Policies. Publish clear and precise rules and policies around when action will be taken on users’ content or accounts.

3. Cultural Competence. Ensure that rules and policies, and their enforcement, take into consideration the diversity of cultures and contexts in which your platforms and services are available and used.

4. State Involvement in Content Moderation. Recognize and address the special concerns that are raised by the involvement of the state in the development of company policies and requests to remove or suspend content and accounts.

5. Integrity and Explainability. Ensure that content moderation systems work reliably and effectively by pursuing accuracy and nondiscrimination in detection methods, submitting to regular assessments, and equitably providing notice and appeal mechanisms.

OPERATIONAL PRINCIPLES

1. Numbers. Publish information about pieces of content and accounts actioned, broken down by country or region, if available, and category of rule violated.

2. Notice. Provide notice to each user whose content is removed, whose account is suspended, or against whom some other action is taken for non-compliance with the service’s rules and policies, including the reason for the removal, suspension, or action.

3. Appeal. Provide a meaningful opportunity for timely appeal of decisions to remove content, to leave up content that has been flagged, to suspend an account, or to take any other action affecting users’ human rights, including the right to freedom of expression.
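The operational principles above imply that every moderation action should be recorded with enough structure to support public reporting (Numbers), a user-facing explanation (Notice), and a tracked appeal (Appeal). A minimal sketch of such a record, as a hypothetical Python schema (all names and fields are illustrative assumptions, not a prescribed format):

```python
from collections import Counter
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ModerationAction:
    """One action taken on a user's content or account (hypothetical schema)."""
    action_id: str
    user_id: str
    action_type: str        # e.g. "content_removed", "account_suspended"
    rule_violated: str      # category of rule, for transparency reporting
    country: Optional[str]  # country/region of the affected user, if available
    reason: str             # human-readable reason, included in the user notice
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    appeal_status: str = "none"  # "none", "pending", "upheld", or "reversed"

def notice_for(action: ModerationAction) -> str:
    """Render the user-facing notice with the reason (Notice principle)."""
    return (f"Action taken ({action.action_type}) for violating our rule on "
            f"{action.rule_violated}: {action.reason}. "
            f"You may appeal this decision.")

def transparency_numbers(actions: list[ModerationAction]) -> Counter:
    """Aggregate actions by country and rule category (Numbers principle)."""
    return Counter((a.country or "unknown", a.rule_violated) for a in actions)
```

Recording the rule category and reason at the moment of action, rather than reconstructing them later, is what makes equitable notice and meaningful appeal feasible at scale.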
