Key takeaways
In Ukraine and other conflict zones, journalism and media organisations face unique challenges when it comes to content moderation on social media platforms. Social media platforms seem to lack the capacity to deal with content moderation challenges effectively and promptly, and media organisations frequently experience delays and a lack of follow-up from platforms after flagging moderation issues. By the time platforms make their decisions, the news items in question are often outdated.
Some platforms remove graphic content related to war and potential violations of human rights. Media outlets often practise self-censorship to avoid being banned or having their accounts suspended. There is a need to ensure that content about war crimes and human rights violations is not lost.
Navigating these challenges requires a collaborative effort between social media platforms, media organisations, and local actors to establish clear protocols and support for fair and effective content moderation practices. By understanding key actors and sources of information, companies can better respond to crises and ensure that valuable information is preserved.
To effectively address crises, companies should have a cross-functional team that can work across issues and areas, operate both externally and internally, and make decisions on both content issues and the future directions a crisis may take.
Certifying "content" can be challenging. Therefore, the recommended approach is to focus on the account level, e.g. the media outlet itself and its practices. The Ethical Journalism Network (EJN) and the Journalism Trust Initiative (JTI) recommend ethical audits as a self-assessment procedure for media outlets to evaluate their ethical standards and governance. They also propose defining a ready-to-use framework that can be activated as soon as media outlets are in a crisis situation (or, even better, in advance). A network of nominating sources, such as national press freedom organisations or councils, should be used to nominate the media outlets to be covered by this emergency protocol. However, there was also acknowledgement that crises come in many different shapes and sizes, including prolonged, slow-burning ones, which should be addressed in future iterations.
Participants advocated the development of an early warning system with a multistakeholder approach. This mechanism should be integrated into the UN Plan of Action on Safety of Journalists and its national plans of action, as the problem directly affects press freedom and the safety of journalists.
The mechanism must be data and evidence-driven, especially to support advocacy on social media accountability and facilitate systematic, sustainable collaboration between civil society and tech platforms.
Issues identified
In Ukraine and other conflict zones, journalism and media organisations face unique challenges when it comes to content moderation on social media platforms. The Lviv Media Forum has experienced delays and a lack of follow-up from platforms after flagging moderation issues, indicating a need for clear mechanisms of cooperation and involvement of local actors in responding to content moderation issues.
Some platforms prioritise creating a "joyful, happy environment," leading to the removal of graphic content related to war and potential violations of human rights. While platforms lack the capacity to deal with content moderation challenges promptly and effectively, media outlets often practise self-censorship to avoid being banned or having their accounts suspended. Several participants raised both evidence preservation and speech suppression as concerns. It is necessary to ensure that information about war crimes and human rights violations is preserved.
Media organisations also face physical risks, cybersecurity threats, and legal threats linked to their online activities. Frequent trolling campaigns are designed to manipulate algorithmic systems into automatically removing or suppressing content. Trusted Partner programmes exist, but they are deployed unevenly and without transparency. A systematic approach is lacking, with issues often addressed on a case-by-case basis.
Navigating these challenges requires a collaborative effort between social media platforms, media organisations, and local actors to establish clear protocols and support for fair and effective content moderation practices during crises. Local communities recognised by international organisations can provide recommendations and validation that media organisations observe ethical and professional standards.
Lessons learned from existing initiatives
Discussion
Emergency and crisis responses
Crises can be defined in various ways, and include natural disasters. What distinguishes a continuing war from an invasion is that each presents peaks in communication issues at different times.
Issues arising from crises and emergencies are multifaceted, involving various actors in different locations who will influence how information is shared and how people, organisations, and governments share and respond to it.
Many existing initiatives have focused on the content side and on automated moderation that aims to take the burden off individuals of reviewing and dealing with every single piece of content. Trusted flagger/partner programmes exist to leverage expertise from the local context.
To effectively address crises, companies should have a cross-functional team that can work across issues and areas, and are able to work both externally and internally, with the ability to make decisions on both content issues and future directions a crisis may take.
A long-term framework is necessary to address crises, including protocols for the initial crisis and mechanisms for learning from the initial implementation. Such a framework needs to look at the systemic challenges of individual companies: what is already working, and where the problems are.
Identification of credible and trusted journalism actors
Sometimes, the regulation of hate speech and illegal content on platforms goes beyond the capacity of industry mechanisms because these issues stem from structural problems in society such as misogyny, homophobia, or racism.
Establishing a voluntary multistakeholder mechanism
The report also calls for the creation of a system that helps to predict, monitor, and ultimately prevent the escalation of online violence to offline harm. Shockingly, 20% of the women journalists interviewed for the report experienced offline attacks, abuse, and harassment that they believed had been seeded online.
Platforms are comfortable with outsourcing risk assessments and the monitoring of escalating online violence and the targeting of journalists to civil society organisations. However, this poses many challenges and is not always efficient. Platforms should be asked to commit and invest more resources in this area.
The non-responsiveness of platforms is a problem globally. Data and cases need to be systematically gathered to showcase the scale and depth of the problem.
Governments need to be part of the conversation and should create regulatory frameworks to address this issue. The mechanism must be data and evidence-driven, especially to support advocacy on social media accountability and facilitate systematic, sustainable collaboration between civil society and tech platforms.
Based on the findings from the Riga workshop, the consultation emphasised the importance of local news and the need to protect local news organisations and journalists. They often lack a channel to communicate with platforms when their content or accounts are blocked or removed due to algorithmic decisions, false flagging, trolling, or cyber-attacks. GFMD has regularly heard about these same issues from members and partners, particularly in countries with small markets that do not receive much commercial interest from the platforms. Local voices are finding it increasingly difficult to operate in an environment that can often be hostile to smaller players.
Annex I: Existing crisis/emergency response initiatives
The IGF is a global multistakeholder platform that facilitates the discussion of public policy issues pertaining to the Internet. While it is not a crisis/emergency response platform per se, it is a key venue for discussing and debating policies relating to the digital environment.
Private Sector, Academia, CSOs
Global Network Initiative aims to protect and advance freedom of expression and privacy rights in the ICT industry by setting a global standard for responsible company decision making and serving as a multistakeholder voice in the face of government restrictions and demands.
The Global Internet Forum to Counter Terrorism (GIFCT) is an NGO designed to prevent terrorists and violent extremists from exploiting digital platforms. Founded by Facebook, Microsoft, Twitter, and YouTube in 2017, the Forum was established to foster technical collaboration among member companies, advance relevant research, and share knowledge with smaller platforms.
The Christchurch Call, an initiative started by New Zealand and France, sets pledges for companies to sign up to, with civil society signing up as supporters through its Advisory Network.
United Nations / Governments
Tech Against Terrorism is an initiative launched and supported by the United Nations Counter Terrorism Executive Directorate (UN CTED) working with the global tech industry to tackle terrorist use of the internet whilst respecting human rights. With support from Public Safety Canada, Tech Against Terrorism launched the Terrorist Content Analytics Platform (TCAP) to prevent the spread of violent terrorist content by flagging it to content moderation teams.
Law enforcement units are able to refer content to platforms when it violates their terms of service. There is little oversight or ability to audit the impact of those referrals.
Crisis Protocols by social media platforms
Twitter: crisis misinformation policy in the context of armed conflict
Google: crisis response focused on natural disasters, relying on AI-driven predictive modelling
YouTube: crisis resource panel where they partner with verified service partners in specific countries or regions. Additionally, the panel has an open forum that anyone can submit through.
Initiative through the World Economic Forum, in association with advertisers: a set of sixteen categories of problematic content next to which advertisers do not want their ads to appear.
Annex II: Existing initiatives for identification of trusted journalism actors
Internews, WAN-IFRA, PubMatic, GroupM
Initiative of United for News, a non-profit coalition of global media industry and international brands
International standard for showcasing and promoting trustworthy journalism
Coalition of journalists, editors, press owners and media support
Journalism and technology tool that rates the credibility of news and information websites
International consortium of news organizations building standards of transparency
Tool that rates news outlets based on the "probability of disinformation"
Machine-readable text file that news publishers add to their websites to signal their affiliations with other trusted news organizations
Annex III: Agenda of the Side-Event and participants list
Tech and Journalism Crisis and Emergency Mechanism: Side-Event UNESCO Conference 2023 - 21st February 9:00h
Registration and refreshments
Welcome and introduction of the T&JM initiative
Mira Milosevic (GFMD)
Courtney Radsch and Michael Karanicolas (ITLP)
Experiences of journalism and media organisations in Ukraine and the region
Crisis and emergency protocols, escalation channels, and mechanisms for identification and verification
What are the lessons learned from recognised international crisis protocols and processes?
What are existing avenues of collaboration with platforms and other private sector stakeholders?
What are the expectations for crisis/emergency protocols and communication and escalation channels? What are the possible actions that platforms could take with the mechanism in place during a crisis or emergency?
Processes and criteria for identification of credible and trusted journalism actors online
What external references, such as news integrity and trust initiatives, professional and ethical self-regulation bodies, and donor and funder audits, could be used to identify credible and trusted journalism actors online?
What should be the minimum requirements for demonstrating adherence to professional and ethical journalistic standards?
Key elements of establishing a voluntary multistakeholder mechanism
What stakeholders and communities will be brought together to establish this mechanism? What should the level and scope of participation be for different types of organisations?
How can local leadership and ownership be ensured?
What role could international organisations play in data collection, monitoring, and advocacy with tech platforms?
How can risk assessment and transparency best be integrated?
Next steps: conclusions and recommendations
Contributions by:
Olga Myrovych, Chief Executive Officer, Lviv Media Forum, Ukraine
Tetiana Avdeieiva, Project Manager on AI, Digital Security Lab, Ukraine
Courtney Radsch, Fellow UCLA, Institute for Technology, Law & Policy, US
Stephen Turner, former Director of EU Public Policy at Twitter, Belgium
Danica Ilic, Program Specialist, Ethical Journalism Network (EJN), UK
Thibaut Bruttin, Assistant Director General, Reporters Without Borders (RSF), France
Julie Posetti, Deputy Vice President and Global Director of Research, International Center for Journalists (ICFJ), US
Ruth Kronenburg, Executive Director, Free Press Unlimited (FPU), Netherlands
Attendees (Organisations)
Centro de Estudios en Libertad de Expresión y Acceso a la Información (CELE)
Center for Media and Information Literacy
Center for Law and Democracy
Délégation permanente du Royaume du Maroc auprès de l'UNESCO
Ethical Journalism Network
European Endowment for Democracy
Global Forum for Media Development
Islamic World Educational, Scientific and Cultural Organization (ICESCO)
International Center for Journalists (ICFJ)
International Fund for Public Interest Media (IFPIM)
International Media Support
Institute for Rebooting Social Media, BKC, Harvard
International Press Institute
Ministry of Information, Publicity and Broadcasting Services (Zimbabwe)
Organization for Security and Co-operation in Europe (OSCE)
Open Society Foundations (OSF)
Permanent Delegation of Türkiye to UNESCO
Reporters Without Borders (RSF)
Thomson Reuters Foundation
Youth Committee on UNESCO Media and Information Alliance