Thursday, November 15, 2018

88 Rights Groups Call for Facebook to Implement Appeals Process for Removed Content

The Cartoonists Rights Network International has signed on to the letter below, citing:

"Too many cartoonists are reporting posts banned by abuse of the existing complaints procedure as well as difficulties caused by algorithmic handling of content."

I would ask that the National Cartoonists Society and the American Association of Editorial Cartoonists consider joining the signatories.

Via the Electronic Frontier Foundation:

Dear Mark Zuckerberg:

What do the Philadelphia Museum of Art, a Danish member of parliament, and a news anchor from the Philippines have in common? They have all been subject to a misapplication of Facebook's Community Standards. But unlike the average user, each of these individuals and entities received media attention, was able to reach Facebook staff and, in some cases, received an apology and had their content restored. For most users, content that Facebook removes is rarely restored, and some users may be banned from the platform – even in the event of an error.

When Facebook first came onto our screens, users who violated its rules and had their content removed or their account deactivated were sent a message telling them that the decision was final and could not be appealed. It was only in 2011, after years of advocacy from human rights organizations, that your company added a mechanism to appeal account deactivations, and only in 2018 that Facebook initiated a process for remedying wrongful takedowns of certain types of content. Those appeals are available for posts removed for nudity, sexual activity, hate speech or graphic violence.

This is a positive development, but it doesn't go far enough.

We, the undersigned civil society organizations, call on Facebook to provide a mechanism for all of its users to appeal content restrictions, and, in every case, to have the appealed decision re-reviewed by a human moderator.

Facebook's stated mission is to give people the power to build community and bring the world closer together. With more than two billion users and a wide variety of features, Facebook is the world's most-used communications platform. We know that you recognize the responsibility you have to prevent abuse and keep users safe. Social media companies, including Facebook, also have a responsibility to respect human rights. International and regional human rights bodies have a number of specific recommendations for improvements here, notably concerning the right to remedy.

Facebook remains far behind its competitors when it comes to affording its users due process. [1] We know from years of research and documentation that human content moderators, as well as machine learning algorithms, are prone to error, and that even low error rates can result in millions of silenced users when operating at massive scale. Yet Facebook users are only able to appeal content decisions in a limited set of circumstances, and it is impossible for users to know how pervasive erroneous content takedowns are without increased transparency on Facebook's part. [2] Furthermore, civil society groups around the globe have criticized the way that Facebook's Community Standards exhibit bias and are unevenly applied across different languages and cultural contexts.

Earlier this year, a group of advocates and academics put forward the Santa Clara Principles on Transparency and Accountability in Content Moderation, which recommend a set of minimum standards for transparency and meaningful appeal. This set of recommendations is supported by the work of David Kaye, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, who recently called for a “framework for the moderation of user-generated online content that puts human rights at the very center.” It is also supported by the UN Guiding Principles on Business and Human Rights.

While we acknowledge that Facebook can and does shape its Community Standards according to its values, the company has a responsibility to protect its users' expression to the best of its ability. Offering a remedy mechanism, as well as more transparency, will go a long way toward supporting user expression.

Specifically, we ask Facebook to incorporate the Santa Clara Principles on Transparency and Accountability in Content Moderation into its policies and practices, and to provide:

Notice: Clearly explain to users why their content has been restricted.
  • Notifications should include the specific clause from the Community Standards that the content was found to violate.

  • Notice should be sufficiently detailed to allow the user to identify the specific content that was restricted, and should include information about how the content was detected, evaluated, and removed.

  • Individuals must have clear information about how to appeal the decision.

Appeals: Provide users with a chance to appeal content moderation decisions.
  • The appeals mechanism should be easily accessible and easy to use.

  • Appeals should be subject to review by a person or panel of persons not involved in the initial decision.

  • Users must have the right to propose new evidence or material to be considered in the review.

  • Appeals should result in a prompt determination and reply to the user.

  • Any exceptions to the principle of universal appeals should be clearly disclosed and compatible with international human rights principles.

  • Facebook should collaborate with other stakeholders to develop new independent self-regulatory mechanisms for social media that will provide greater accountability. [3]

Numbers: Issue regular transparency reports on Community Standards enforcement.
  • Present complete data describing the categories of user content that are restricted (text, photo or video; violence, nudity, copyright violations, etc.), as well as the number of pieces of content that were restricted or removed in each category.

  • Incorporate data on how many content moderation actions were initiated by a user flag, a trusted flagger program, or by proactive Community Standard enforcement (such as through the use of a machine learning algorithm).

  • Include data on the number of decisions that were effectively appealed or otherwise found to have been made in error.

  • Include data reflecting whether the company performed any proactive audits of its unappealed moderation decisions, as well as the error rates the company found.

[1] See EFF's Who Has Your Back? 2018 Report, https://www.eff.org/who-has-your-back-2018, and Ranking Digital Rights Indicator G6, https://rankingdigitalrights.org/index2018/indicators/g6/.
[2] See Ranking Digital Rights, Indicators F4, https://rankingdigitalrights.org/index2018/indicators/f4/, and F8, https://rankingdigitalrights.org/index2018/indicators/f8/, and New America's Open Technology Institute, “Transparency Reporting Toolkit: Content Takedown Reporting”, https://www.newamerica.org/oti/reports/transparency-reporting-toolkit-content-takedown-reporting/.
[3] For example, see Article 19's policy brief, “Self-regulation and 'hate speech' on social media platforms,” https://www.article19.org/wp-content/uploads/2018/03/Self-regulation-and-%E2%80%98hate-speech%E2%80%99-on-social-media-platforms_March2018.pdf.

Electronic Frontier Foundation (EFF)
7amleh - Arab Center for the Advancement of Social Media
Adil Soz - International Foundation for Protection of Freedom of Speech
Africa Freedom of Information Centre (AFIC)
Albanian Media Institute
Americans for Democracy & Human Rights in Bahrain (ADHRB)
ARTICLE 19
Asociación Mundial de Radios Comunitarias América Latina y el Caribe (AMARC ALC)
Association for Freedom of Thought and Expression (AFTE)
Bytes for All (B4A)
Cartoonists Rights Network International (CRNI)
Center for Independent Journalism - Romania
Center for Media Studies & Peace Building (CEMESP)
Child Rights International Network (CRIN)
Committee to Protect Journalists (CPJ)
Digital Rights Foundation
Foro de Periodismo Argentino
Foundation for Press Freedom - FLIP
Freedom Forum
Fundamedios - Andean Foundation for Media Observation and Study
Gulf Centre for Human Rights (GCHR)
Human Rights Watch (HRW)
Independent Journalism Center (IJC)
Initiative for Freedom of Expression - Turkey
International Press Centre (IPC)
MARCH
Mediacentar Sarajevo
Media Institute of Southern Africa (MISA)
Media Rights Agenda (MRA)
OpenMedia
Pacific Islands News Association (PINA)
PEN America
PEN Canada
SFLC.in
Social Media Exchange (SMEX)
Southeast Asian Press Alliance (SEAPA)
South East Europe Media Organisation
Syrian Center for Media and Freedom of Expression (SCM)
Vigilance for Democracy and the Civic State
Visualizing Impact (VI)
ACLU Foundation of Northern California
American Civil Liberties Union (ACLU)
Arab Digital Expression Foundation
Artículo 12
Association for Progressive Communications (APC)
Brennan Center for Justice at NYU School of Law
CAIR San Francisco Bay Area
CALAM
Cedar Rapids, Iowa Collaborators
Center for Democracy and Technology
EFF Austin
El Instituto Panameño de Derecho y Nuevas Tecnologías (IPANDETEC)
Electronic Frontier Finland
Elektronisk Forpost Norge
Fundación Acceso
Fundación Ciudadano Inteligente
Fundación Datos Protegidos
Fundación Internet Bolivia.org
Fundación Vía Libre
Garoa Hacker Club
HERMES Center for Transparency and Digital Human Rights
Hiperderecho
Homo Digitalis
Idec - Brazilian Institute of Consumer Defense
Instituto Nupef
Internet Without Borders
Intervozes - Coletivo Brasil de Comunicação Social
La Asociación para una Ciudadanía Participativa ACI Participa
May First/People Link
New America's Open Technology Institute
NYC Privacy
Open MIC (Open Media and Information Companies Initiative)
Panoptykon Foundation
Peninsula Peace and Justice Center
Portland TA3M
Privacy Watch
Raging Grannies
Ranking Digital Rights
ReThink LinkNYC
Rhode Island Rights
SHARE Foundation
SumOfUs
Syrian Archive
t4tech
Techactivist.org
Viet Tan
Witness
Xnet
