Content Moderation

By Priscilla Ruiz (Digital Rights Legal Coordinator, ARTICLE 19 Mexico & Central America Office)

Organizations across Mexico and Latin America have witnessed and criticized how content protected by freedom of expression has been removed illegitimately in order to evade public criticism of State conduct. Freedom of expression is not absolute: it admits exceptions such as child pornography, war propaganda and hate speech, including incitement to violence and direct public incitement to genocide. Yet States have invoked these exceptions to remove content, disregarding the criteria established by courts that have resolved cases concerning freedom of expression as a human right.

Content moderation [1], however, sits within a complex, multidimensional system. This system involves [2], in the first place, actions and decisions about the distribution of content, based on rules set out in the terms and conditions and community standards of services provided by private companies in a digital environment that increasingly shapes the public sphere. In the second place, it involves the creation and implementation of laws [3] through which States seek to regulate that environment and thereby control how content moderation takes place.

Over the past two years it has become clear how this illegitimate practice occurs. On the one hand, social media platforms suppress content under their terms and conditions or community standards, some of which are incompatible with international human rights standards [4]. On the other hand, governments use local law or the platforms' internal mechanisms to request the removal of content without any prior judicial review of the legitimacy of the request, leaving anyone who tried to exercise their freedom of expression completely defenseless in the digital environment. The result is the blocking, destruction or deletion of images, journalistic articles, videos and audio of all kinds: an illegitimate interference with the right to information and freedom of expression, carried out without applying the corresponding legal mechanisms, without a balancing analysis, and without exhausting a proportionality test weighing freedom of expression against copyright [5].

It is worth noting that content moderation on digital platforms produces a chilling effect whenever actions such as content removal [6] leave users unable to fully understand the legal grounds on which their content was removed. Arbitrary content removal not only violates the freedom of expression of citizens and journalists; it also violates the copyright in their own publications. These rights arise the moment a work is created through the manifestation of ideas and the expression of authorship, for example in a journalistic article on a newsworthy issue. Violating the moral and economic rights of those whose content is removed therefore also curtails their copyright protection. [7]

When State actors or private entities such as web hosting services or social media platforms remove content without regard to international human rights standards, they restrict freedom of expression in the digital environment, limiting the flow of information of public interest that a democratic society receives. Unfortunately, and contrary to international freedom of expression principles, in recent years social media platforms, audiovisual channels and web hosting services have tightened the internal rules in their terms of service, policies and community standards, justifying this as protection of authors' intellectual property interests and even of rights related to privacy and personal data.

The extensive and ambiguous internal rules of social media platforms and web hosting services, together with their inflexible application, have left users with no idea where or how to defend their digital rights against arbitrary mechanisms that contravene international human rights standards.

In recent years, organizations that defend digital rights, such as ARTICLE 19, have documented cases of journalists and activists whose important reporting has been suppressed from the internet through a mechanism known as "notice and takedown". Such was the case when the Digital Millennium Copyright Act (DMCA) was applied against the journalist Pedro Canché and against various news outlets in San Luis Potosí, resulting in the removal of content that allegedly infringed the copyright of others. In many of these cases, the removed content belonged to the very publisher who received the removal notice, suppressing his or her freedom of expression not only as an author, but also as a citizen.

These notifications were sent pursuant to US copyright law, although almost none of the journalists had in fact committed the copyright infringements defined in the DMCA. Unfortunately, the law has a disproportionate effect because it allows copyright owners to send a simple notice to the service provider claiming that content was published without their consent, whereupon the content is removed immediately, sometimes without any legal review and instead through an automated system.

This mechanism ends with internal decisions that social media platforms and hosting providers take through procedures such as notice and takedown which, far from protecting copyright, operate illegitimately, arbitrarily and contrary to international human rights standards. The decisions that digital platforms make lack the safeguards of a jurisdictional process, which begins with formal and material requirements of admissibility and standing. These safeguards include (i) the right to due process and transparency, (ii) the application of adequate precautionary measures, and (iii) proportionate decisions and the possibility of appealing them. Since users have no access to such procedures, we are brought to the next question: should the State regulate this mechanism, or should digital platforms self-regulate within a human rights framework? This is the beginning of a broad discussion in which multiple stakeholders should participate.

[1] Content moderation “refers to the decisions and actions that companies take to determine what information is allowed on their platforms […] in the event that they determine that any content violates their laws, community regulations, policies and/or terms and conditions, they can resort to the removal of content, the blocking of user accounts or other measures”. United Nations, General Assembly, Human Rights Council, Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, April 6, 2018, p. 3.

[2] “Content moderation is a tool used to exclude or prevent access to specific content within a platform or web service. This prerogative is usually exercised in a discretionary way by companies through their policies of use or terms of service, but it is sometimes also imposed by law and, in some cases, exercised due to pressure from governments”. Guerrero, Carlos, “Content Moderation Policies”, Hyperright, February 25, 2020.

[3] Center for Studies on Freedom of Expression and Access to Information (CELE) and The Initiative for Freedom of Expression on the Internet, The keys of the housekeeper: the strategy of internet intermediaries and the impact on the digital environment.

[4] ARTICLE 19, “Facebook community standards”, June 2018.

[5] ARTICLE 19 Mexico and Central America Office, #LibertadNoDisponible: Censura y remoción… op. cit.

[6] The removal of content is understood as the practice of eliminating or restricting the circulation of information on the internet, using legal frameworks and private mechanisms that limit access to it and that are used in an illegitimate and irresponsible way to censor information of public interest that should circulate and remain accessible. This practice originates through three main channels: (i) requests to remove content from digital platforms, made by the State, persons of public interest or private actors; (ii) direct communication by these same actors with journalists or media outlets demanding the removal of content from their websites or social media pages; and (iii) the application of the terms and conditions of service or the community standards of the digital platforms.

[7] The Office of the Special Rapporteur for Freedom of Expression of the Inter-American Commission on Human Rights, in its report entitled Standards for a free, open and inclusive Internet, has determined that the “removal of Internet content has an evident impact on the right to freedom of expression, both in its individual and social dimensions, and in the public’s right of access to information. The information removed does not circulate, which affects the right of people to express themselves and disseminate their opinions and ideas and the right of the community to receive information and ideas of all kinds”. Office of the Special Rapporteur for Freedom of Expression, “Standards for a free, open and inclusive Internet”, Inter-American Commission on Human Rights, March 15, 2017, paragraph 133, p. 54.

Catalysts for Collaboration is looking for a communications intern

The Catalysts for Collaboration project is looking for an intern to help manage our social media and website, initially from the beginning of March until summer. The position is for a few hours per week, paid, and can be done remotely on a flexible schedule.

Primary responsibilities include: managing C for C’s social media, updating our website and blog, and developing communication materials as needed with a view to creating and nurturing an online community around the project, developing collaborations with other relevant actors and raising awareness about strategic litigation in campaigns for social change.

We’re looking for someone with strong independent working skills, who is creative and passionate about human rights. And if you’re also into strategic litigation, we consider that a bonus!

If you are interested, please send your CV with a brief statement of motivation to info [at] catalystsforcollaboration [dot] org with “Communications Intern” in the subject line.

We look forward to hearing from you!

Catalysts for Collaboration at FIFAfrica 2020: relaunching best practices for digital rights litigation

Relaunch of Catalysts for Collaboration at FIFAfrica in September 2020

By Shreya Tewari

In September 2020, the Catalysts for Collaboration project was showcased and relaunched at the Forum on Internet Freedom in Africa, 2020. The collaboration-oriented model that the project introduces has been used worldwide to help develop better practices for strategic litigation. 

For the relaunch of the project at FIFAfrica 2020, we had a line-up of excellent panellists joining us from across continents, who had been a part of the project in its different stages.

The catalysts were first developed in 2017 by Nani Jansen Reventlow during her time as a Fellow at Harvard University’s Berkman Klein Center. They embody best practices for collaboration in strategic litigation such as ‘Planning as a group’, ‘Embracing the tech’ and ‘Involving your stakeholders’, among others. They were the result of multi-stakeholder meetings held with litigators from various jurisdictions. The catalysts were then complemented by case studies of strategic litigation efforts whose outcomes were heavily influenced by collaboration and where the catalysts had organically emerged as the tools that were used for it. 

In September 2020, the project was relaunched in four languages and with two additional case studies, from Kenya and Spain. The panel included Mason Kortz, clinical instructor at Harvard Law School’s Cyberlaw Clinic, who was instrumental in helping shape the catalysts themselves; Wakesho Kililo, an independent lawyer at the High Court of Kenya, who featured in the Kenyan biometric ID case study; Juan Carlos Lara, Research and Public Policy Director at Derechos Digitales, who was a critical part of the surveillance balloons in Chile case study; and Edrine Wanyama, Legal Officer at CIPESA, who had used the catalysts to develop and analyse case studies in Africa. I had the pleasure of co-moderating the session along with Jansen Reventlow. 

The session opened with Kortz discussing how the catalysts had taken shape. He spoke of how the original concepts that Jansen Reventlow put together were vetted and mulled over by fifteen stakeholders from different disciplines, culminating in 60–70 discrete pieces of advice that people wanted to convey to practitioners. The next step was to distil these pieces of advice into twelve catalysts representing the combined knowledge of lawyers, activists, academics, and technologists around the world, who produced these common themes for collaboration. 

Next, I had the pleasure of speaking about my experience developing the case studies. The most fascinating aspect for me was observing that each actor had a different story of how a catalyst had enabled successful litigation, and seeing the multi-jurisdictional and interdisciplinary litigation projects in which the catalysts have been used. This shed light on how the catalysts are far from scientific principles that must be uniformly applied to scenarios to reach pre-determined outcomes. Instead, they are unique action verbs that focus on more than just an outcome: they depict how each unique journey can benefit from collaboration. 


Kililo shed light on the advantages of planning as a group in Kenya’s biometric ID case and how that allowed the team to harness the resources and expertise of each of its members. She also emphasised the strategic victories in the litigation, noting how it paved the way for Kenya’s first-ever Data Protection Bill. She concluded by drawing attention to the catalyst of involving your stakeholders and how that enabled public discourse on important issues, leading to larger social change. 

Lara spoke of his experience of learning to collaborate at the helm of Derechos Digitales in its first strategic litigation. He spoke of the need for constant collaboration at every stage of the litigation and even more so, due to the opaque nature of the technology being used by the government when deploying surveillance balloons over municipalities in Santiago. He emphasised how organisations with different areas of expertise came together to get the media involved to generate a larger debate on the issue.  Most importantly, he noted that despite their loss in court, the momentum of the larger debate was used to move the conversation forward and motivated much-needed regulation and policy change. 

Finally, we heard Wanyama talk about his own experience of using the catalysts to develop case studies in Africa. He noted that his objective in the analysis was to identify instances where the catalysts had been unknowingly employed in the litigation. Like Lara, he emphasised that the outcome of a case may not be what was anticipated. Importantly, however, he noted that in strategic litigation, the very decision to initiate a case is enough to show that the cause was worth fighting for. 


As the session drew to a close, we learned about the panellists’ most cherished catalysts, among which catalyst five proved especially popular: respect and acknowledge your collaborators. In the end, panellists and audience alike went away feeling more hopeful and better equipped to push for broader social change through successful collaborations. 

Relaunch: Catalysts for Collaboration, a resource to advance digital rights through collaborative strategic litigation

Catalysts for Collaboration: Advancing collaboration in digital rights campaigns. Image by Daniel Cuckier under CC BY-ND 2.0.

More and more countries are adopting legislation that threatens human rights online. Four years after its original launch in 2016, the Catalysts for Collaboration website has been relaunched in four languages — English, French, Spanish and Russian — encouraging internet activists to collaborate across disciplinary silos and strengthen their digital rights litigation.

When litigators, technologists, activists and academics collaborate on strategic cases, they are likely to be more effective, more creative and more resilient: the whole is greater than the sum of its parts.

The Catalysts for Collaboration website offers guidance, best practices, and case studies to encourage those working for digital rights to collaborate across disciplinary silos.

Catalysts for Collaboration

The result of a project initiated by Nani Jansen Reventlow when she was a fellow at the Berkman Klein Center for Internet & Society at Harvard University, the website offers best practices — “Catalysts for Collaboration” — combined with a set of case studies to illustrate how lawyers, technologists, activists and academics can work together on strategic litigation for digital rights.

Litigation is an effective tool that can assist in removing restrictions on our human rights. Yet, it is often under-utilized because of a lack of effective collaboration between different actors: lawyers, activists, academics and technical experts.

Even when collaboration would seem obvious and mutually beneficial, these different actors tend to operate in silos, both within their disciplines and geographically. Where there is an interest and willingness to work together, people hesitate because they do not know where to start.

The 12 Catalysts presented on the website — based on a comprehensive set of best practices co-written by a group of experts in different disciplines — offer practical suggestions for those seeking to advance digital rights, ranging from the need to plan as a group to coordinating communication strategies.

The case studies — featuring litigation work from Africa, Asia, Europe, and the Americas — illustrate best practices used in strategic human rights litigation, both in the digital context and outside of it, and offer valuable lessons learned. For example, the work of the Electronic Frontier Foundation to protect the privacy of Google Books users showcases the importance of embracing technology, while litigation brought by the Center for Constitutional Rights successfully challenged the stop-and-frisk policies of the New York City Police, illustrating the importance of involving all stakeholders in social change processes. Other lessons learned include the need to ensure a good gender balance in collaborative efforts, and that losing a case in court does not mean it cannot benefit the overall cause being pursued.

As the technology affecting our fundamental rights grows more complex, experts in all disciplines and across geographical boundaries will need to work together better to protect our human rights. The Catalysts for Collaboration support that effort.

“Litigation can be a crucial lever in bringing about change and protecting our human rights, including in the digital context,” Jansen Reventlow commented. “We hope that the Catalysts for Collaboration will encourage activists to reflect on how litigation can assist in pushing their agenda, and that the best practices and case studies can inspire them in adopting effective working methods.”

She added: “Having this resource available in different languages will also further enable us to learn from examples in other regions and start thinking about a truly global movement to safeguard our human rights online.”

More case studies will be added in the coming months, highlighting collaborative strategic litigation projects from, among other places, Russia and the Philippines.