NMC Review of CoE Guidelines on Responsible AI Implementation in Journalism

In the rapidly evolving landscape of artificial intelligence (AI) and its application within journalism, ensuring that these technologies are used responsibly is more crucial than ever. Recognizing this, the Council of Europe’s (CoE) Steering Committee on Media and Information Society (CDMSI) adopted comprehensive guidelines in November 2023. These guidelines aim to provide practical advice for policymakers, technology providers, platforms, media professionals, and other stakeholders on implementing and critically evaluating AI systems in journalism.

Following the publication and adoption of the draft Council of Europe Guidelines on the responsible use of AI in journalism, the report is now in the hands of the CoE’s Implementation Unit, which will liaise with other specialised CoE departments, committees, and member states.

The Core Principle: Freedom of Expression

The CoE’s guidelines are rooted in the fundamental right to freedom of expression as enshrined in Article 10 of the Convention for the Protection of Human Rights and Fundamental Freedoms. This principle underscores the vital role of media and journalism in democratic societies. The guidelines highlight the need for a balanced approach that upholds these freedoms while addressing the ethical, practical, and societal implications of AI in newsrooms.

Key Recommendations for News Organizations

News organizations are at the forefront of adopting AI technologies to enhance their reporting and operational efficiencies. The CoE guidelines provide a robust framework to help these organizations navigate the complexities of AI integration:

  • Alignment with Organizational Mission: Newsrooms must align their use of AI with their core mission and values, ensuring that AI tools support their journalistic objectives rather than compromising them.

“Understand what the organizational mission is and how it aligns with the use of AI.”

  • Accountability and Oversight: It’s crucial to establish clear accountability for AI use. Given the potential editorial impact of AI tools, the guidelines suggest that editors-in-chief hold responsibility for AI applications in the newsroom.

“Decide who has accountability for the way in which AI is used (as the guidelines view AI as an editorial tool it suggests that the editor-in-chief has accountability).”

  • Training and Risk Assessment: Continuous training for staff and systematic risk assessments are essential to recognize and mitigate the risks associated with AI. This includes understanding AI’s limitations and ensuring its use does not lead to biased or erroneous outputs, especially in sensitive areas such as election reporting.

“Ensure that all staff are AI trained.”

“Conduct systematic risk assessments to recognize, assess and mitigate risks of AI use.”

  • Transparency and Inclusiveness: Decisions around AI use should be transparent and inclusive, allowing for diverse perspectives and ensuring that stakeholders understand how AI affects editorial processes and outputs.

“Ensure that decisions around the use of AI are transparent and inclusive i.e. that everyone has the right and ability to express different interests and perspectives.”

  • Compliance and Evaluation: News organizations must rigorously evaluate AI systems for compliance with privacy and data protection rules. They should not rely solely on technology providers but conduct their own assessments to ensure data quality and counter biases.

“Conduct rigorous evaluations of data availability and quality as well as compliance with privacy and data protection rules in order to counter biases, stereotypes and harmful differentiations.”

  • Ethical Use and Public Engagement: AI should be used in accordance with journalistic ethics, maintaining traditional values such as fairness, accuracy, and objectivity. News organizations should also foster public debate about AI’s role in society.

“Use AI in accordance with the ethics of journalism, aligned with professional codes, and in a way that does not impinge upon the human rights of others.”

“Encourage public debate about the use of AI in society.”

Responsibilities for AI Providers

AI providers play a crucial role in supporting news organizations’ use of these technologies. The guidelines call on AI providers to:

  • Support Editorial Autonomy: AI tools should respect the editorial independence of news organizations and be designed to enhance journalistic practices without imposing external biases.

“Respect editorial autonomy and news media independence.”

  • Provide Transparency and Guidance: Providers must be transparent about the AI models and data they use and offer practical guidance to newsrooms, particularly as journalists may lack technical expertise.

“Be transparent about the models and data they use – must provide news organizations with adequate information to facilitate risk assessments.”

  • Facilitate Newsroom Adaptation: By making models and training data accessible, AI providers can help newsrooms develop their AI capabilities and ensure they are well-prepared for the operational changes these technologies bring.

“Make certain models, training data, and other resources available to newsrooms to develop journalistic AI systems (where commercially feasible).”

Guidelines for Dissemination Platforms

News dissemination platforms, such as social media and content aggregators, are also addressed in the guidelines. They are urged to:

  • Ensure Access and Trust: Platforms should ensure that news content remains universally available, easy to find, and recognized as a trusted source by the public.

“Develop internal governance responses to ensure that content is universally available, easy to find, and recognized as a source of trusted information by the public.”

  • Maintain Editorial Standards: Platforms should avoid interfering with news content and respect the editorial standards of news organizations, ensuring that their algorithms do not distort or manipulate the distribution of news.

“Avoid interference with news content and refrain from overwriting editorial standards.”

  • Collaborate to Combat Misinformation: Effective collaboration with news media, civil society, and fact-checkers is essential to tackle disinformation and promote reliable information.

“Collaborate with the news media, civil society, and other relevant stakeholders like fact-checkers in tackling dis-/misinformation.”

Role of Governments

Governments have a pivotal role in creating an environment where responsible AI use in journalism can thrive. The guidelines recommend that governments:

  • Support Media Pluralism and Innovation: By diversifying funding schemes and investing in digital media innovation, particularly for small and local media, governments can support the sustainable development of responsible AI systems.

“Create favorable conditions for the realization of human rights and media pluralism, including diversification of funding schemes to support responsible AI systems and alternative digital tools.”

  • Encourage Transparency and Accountability: Governments should develop standards and encourage practices that enhance transparency and accountability in AI use, supporting independent regulatory and self-regulating bodies in the media sector.

“Develop guidelines and standards for responsible use of AI.”

Embracing the Future of Journalism with AI

As AI continues to transform the media landscape, these guidelines from the Council of Europe provide a vital roadmap for news organizations, technology providers, dissemination platforms, and governments alike. By fostering responsible AI use, the guidelines aim to uphold the values of freedom of expression and journalistic integrity, ensuring that the media can continue to serve its democratic and societal role effectively.

About the News Media Coalition (NMC):
The NMC is an international consortium of leading news agencies and media organizations dedicated to promoting and protecting the interests of Primary Source Journalism. For further details on these guidelines and the NMC’s stance, please visit NMC’s official website.

About the Council of Europe’s CDMSI:
The Steering Committee on Media and Information Society (CDMSI) is responsible for steering the Council of Europe’s work in the field of media and information society, ensuring that freedom of expression and media freedom are protected across Europe.
