Consultation Report - 21 Feb 2023 - Paris, France

The second consultation was held during UNESCO’s Internet for Trust Conference in Paris on 21 February 2023, co-organized by GFMD and UCLA’s Institute for Technology, Law and Policy.

Key takeaways

  • In Ukraine and other conflict zones, journalism and media organizations face unique challenges when it comes to content moderation on social media platforms. Platforms seem to lack the capacity to deal with content moderation challenges effectively and promptly, and media organisations frequently experience delays and a lack of follow-up from platforms after flagging moderation issues. By the time platforms make decisions, news items have become outdated.

  • Some platforms remove graphic content related to war and potential violations of human rights. Media outlets often practice self-censorship to avoid being banned or to prevent their accounts from being suspended. There is a need to ensure that content about war crimes and human rights violations is not lost.

  • Navigating these challenges requires a collaborative effort between social media platforms, media organisations, and local actors to establish clear protocols and support for fair and effective content moderation practices. By understanding key actors and sources of information, companies can better respond to crises and ensure that valuable information is preserved.

  • To effectively address crises, companies should have a cross-functional team that can work across issues and areas, is able to work both externally and internally, and has the ability to make decisions on both content issues and the future directions a crisis may take.

  • Certifying "content" can be challenging. Therefore, the recommended approach is to focus on the account level, i.e. the media outlet itself and its practices. The Ethical Journalism Network (EJN) and the Journalism Trust Initiative (JTI) recommend ethical audits as a self-assessment procedure for media outlets to evaluate their ethical standards and governance. They also propose defining a ready-to-use framework that can be adopted as soon as media outlets are in a crisis situation (or, better still, in advance). A network of nominating sources, such as national press freedom organizations or councils, should be used to nominate media outlets to be covered by this emergency protocol. However, there was also acknowledgement that crises come in many shapes and sizes, including prolonged, slow-burning ones, which should be addressed in future iterations.

  • Participants advocated the development of an early warning system with a multistakeholder approach. This mechanism should be integrated into the UN Plan of Action on Safety of Journalists and its national plans of action, as the problem directly affects press freedom and the safety of journalists.

  • The mechanism must be data and evidence-driven, especially to support advocacy on social media accountability and facilitate systematic, sustainable collaboration between civil society and tech platforms.

Issues identified

In Ukraine and other conflict zones, journalism and media organisations face unique challenges when it comes to content moderation on social media platforms. The Lviv Media Forum has experienced delays and a lack of follow-up from platforms after flagging moderation issues, indicating a need for clear mechanisms of cooperation and involvement of local actors in responding to content moderation issues.

Some platforms prioritise creating a "joyful, happy environment," leading to the removal of graphic content related to war and potential violations of human rights. While platforms lack the capacity to promptly and effectively deal with content moderation challenges, media outlets often practise self-censorship to avoid being banned or to prevent their accounts from being suspended. Evidence preservation and speech suppression were mentioned by several participants. It is necessary to ensure that information about war crimes and human rights violations is preserved.

Media organisations also face physical risks, cybersecurity threats, and legal threats linked to their online activities. Frequent trolling campaigns are designed to manipulate algorithmic systems, causing them to automatically remove or suppress content. Trusted Partner programmes exist, but are unevenly and not transparently deployed. A systematic approach is lacking, with issues often addressed on a case-by-case basis.

Navigating these challenges requires a collaborative effort between social media platforms, media organisations, and local actors to establish clear protocols and support for fair and effective content moderation practices during crises. Local communities recognised by international organisations can provide recommendations and validation that media organisations are observing ethical and professional standards.

Lessons learned from existing initiatives

There are numerous mechanisms and initiatives in place to deal with content moderation in crises and emergencies. Many of them are focused on countering hate speech and terrorist content online: the Christchurch Call was launched by New Zealand and France; the Terrorist Content Analytics Platform's (T-CAP) crisis protocol policy employs an alert system, archives live streams and footage, and trains AI to detect violent terrorist content; and the Global Internet Forum to Counter Terrorism (GIFCT) was founded by tech platforms to prevent terrorists and violent extremists from exploiting digital platforms. Although these initiatives are narrowly focused on specific types of content, it is still difficult to evaluate their effectiveness. There is a lack of information on their goals, metrics, and benchmarks for success. Claims of success are often based solely on quantitative data, such as the number of takedowns or actions taken against content, provided by companies without independent review or audit. This focus on quantitative measures raises questions about the logic behind these mechanisms and whether they are truly effective in addressing the underlying issues.

There are also crisis response plans in place at social media platforms such as Twitter, Google, YouTube, and Meta. Another private-sector initiative is the Global Alliance for Responsible Media, which focuses on advertising placed next to problematic content. From the civil society side, Access Now's Digital Security Helpline provides rapid-response emergency assistance for online attacks (for more details, please consult Annex I).

Discussion

Emergency and crisis responses

Crises can be defined in various ways and include natural disasters. The distinction between a continuing war and an invasion is that each presents peaks in communication issues at different times.

Issues arising from crises and emergencies are multifaceted, involving various actors in different locations who influence how information is shared and how people, organisations, and governments respond to it.

Many existing initiatives have focused on the content side and on automated moderation, which aims to take the burden off individuals of reviewing and dealing with every single piece of content. Trusted flagger/partner programmes exist to leverage expertise from the local context.

To effectively address crises, companies should have a cross-functional team that can work across issues and areas, is able to work both externally and internally, and has the ability to make decisions on both content issues and the future directions a crisis may take.

Additionally, in a crisis situation, the platform's priority is to understand who the key actors are and where the information comes from, and to mitigate state-aligned information operations, since individual journalists and media outlets can do little to counter these on their own. By understanding key actors and sources of information, companies can better respond to crises and ensure that valuable information is preserved.

A long-term framework is necessary to address crises, including protocols for the initial crisis and learning from their initial implementation. Such a framework needs to look at the systemic challenges of individual companies, what is already working, and where the problems are.

Identification of credible and trusted journalism actors

Sometimes, the regulation of hate speech and illegal content on platforms goes beyond the capacity of industry mechanisms because these issues stem from structural problems in society such as misogyny, homophobia, or racism.

To address this, the Ethical Journalism Network (EJN) recommends ethical audits to media outlets as a self-assessment procedure to evaluate their ethical standards and governance. Smaller media outlets such as fact-checking, non-profit, and younger newsrooms are easier to work with, especially in relation to the self-assessment program promoted by EJN. However, certifying "content" can be challenging. Therefore, the approach is to focus on the media outlet itself and its practices.

During a crisis, it is difficult for media outlets to help formulate frameworks or mechanisms. To address this, Reporters Without Borders (RSF) and the Journalism Trust Initiative (JTI) propose defining a framework in the early stages of a crisis or even before a crisis. All stakeholders involved should agree on the adoption of a ready-to-use framework as soon as media outlets are in a crisis situation. A network of nominating sources, such as national press freedom organizations or councils, could be used to nominate media outlets that should be encompassed in this emergency protocol.

Establishing a voluntary multistakeholder mechanism

The Chilling, a report by ICFJ and UNESCO, reveals that online violence, which operates at the intersection of disinformation and other forms of hate speech, can be incredibly damaging due to its chilling effect. Among its recommendations, the study advocates for the development of an early warning system with a multistakeholder approach. This mechanism should be integrated into the UN Plan of Action on Safety of Journalists and its national plans of action, as the problem directly affects press freedom and the safety of journalists.

The report also calls for the creation of a system that helps to predict, monitor, and ultimately prevent the escalation of online violence to offline harm. Shockingly, 20% of the women journalists interviewed for the report experienced offline attacks, abuse, and harassment that they believed had been seeded online.

Platforms are comfortable with outsourcing risk assessments and monitoring of online violence escalation and targeting of journalists to civil society organizations. However, this poses many challenges and is not always efficient. Platforms should be asked to commit and invest more resources in this area.

The non-responsiveness of platforms is a problem globally. Data and cases need to be systematically gathered to showcase the scale and depth of the problem.

Governments need to be part of the conversation and should create regulatory frameworks to address this issue. The mechanism must be data and evidence-driven, especially to support advocacy on social media accountability and facilitate systematic, sustainable collaboration between civil society and tech platforms.

Background information

The Global Forum for Media Development (GFMD) is launching a multistakeholder mechanism, the Tech and Journalism Crisis and Emergency Mechanism (T&JM), to establish an escalation channel between tech platforms and independent and public interest media, and community and investigative journalism organisations. After the first workshop with local partners in Riga, GFMD and UCLA's Institute for Technology, Law and Policy co-organised a consultation during UNESCO's Internet for Trust Conference in Paris in February 2023. The consultation sought views from a wider range of partners on how the mechanism should be operationalized, how different actors should be involved, and what the expectations of such a mechanism were.

Building on the findings from the Riga workshop, the consultation emphasised the importance of local news and the need to protect local news organisations and journalists. They often lack a channel to communicate with platforms when their content or accounts are blocked or removed due to algorithmic decisions, false flagging, trolling, or cyber-attacks. GFMD has regularly heard about these same issues from members and partners, particularly in countries with small markets that do not attract much commercial interest from the platforms. Local voices are finding it increasingly difficult to operate in an environment that can often be hostile to smaller players.

Annex I: Existing crisis/emergency response initiatives

Annex II: Existing initiatives for identification of trusted journalism actors

Annex III: Agenda of the Side-Event and participants list

Tech and Journalism Crisis and Emergency Mechanism: Side-Event UNESCO Conference 2023 - 21st February 9:00h

Contributions by:

  • Olga Myrovych, Chief Executive Officer, Lviv Media Forum, Ukraine

  • Tetiana Avdeieiva, Project Manager on AI, Digital Security Lab, Ukraine

  • Courtney Radsch, Fellow UCLA, Institute for Technology, Law & Policy, US

  • Stephen Turner, former Director of EU Public Policy at Twitter, Belgium

  • Danica Ilic, Program Specialist, Ethical Journalism Network (EJN), UK

  • Thibaut Bruttin, Assistant Director General, Reporters Without Borders (RSF), France

  • Julie Posetti, Deputy Vice President and Global Director of Research, International Center for Journalists (ICFJ), US

  • Ruth Kronenburg, Executive Director, Free Press Unlimited (FPU), Netherlands

Attendees (Organisations)

  1. ARTICLE 19

  2. BBC Media Action

  3. Cafeyn

  4. Cardiff University

  5. Centro de Estudios en Libertad de Expresión y Acceso a la Información (CELE)

  6. Center for Media and Information Literacy

  7. Center for Law and Democracy

  8. CREOpoint

  9. Délégation permanente du Royaume du Maroc auprès de l'UNESCO

  10. Digital Security Lab

  11. EDRi

  12. Ethical Journalism Network

  13. European Endowment for Democracy

  14. Free Press Unlimited

  15. Ghana News Agency

  16. Global Forum for Media Development

  17. Global Media Registry

  18. Global Partners Digital

  19. Google

  20. Islamic World Educational, Scientific and Cultural Organization (ICESCO)

  21. International Center for Journalists (ICFJ)

  22. International Fund for Public Interest Media (IFPIM)

  23. International Media Support

  24. Institute for Rebooting Social Media, BKC, Harvard

  25. International Press Institute

  26. Internews

  27. Kaalmo legal Aid Center

  28. Lviv Media Forum

  29. Maharat Foundation

  30. Media Council of Malawi

  31. METÏS INNODEV SAS

  32. Ministry of Information, Publicity and Broadcasting Services (Zimbabwe)

  33. Organization for Security and Co-operation in Europe (OSCE)

  34. Open Society Foundation (OSF)

  35. Meta Oversight Board

  36. Permanent Delegation of Türkiye to UNESCO

  37. RNW Media

  38. Reporters Without Borders (RSF)

  39. Surfshark

  40. Tech 4 Peace

  41. Thomson Reuters Foundation

  42. Youth Committee on UNESCO Media and Information Alliance
