AI Act: High-risk AI systems need more nuance


The EBU supports the aims of the AI Act. The risk-based approach adopted by the European Commission, with its graduated levels of obligations, seems well suited to regulating such a complex area.

Public service media use AI systems to deliver information and services to citizens. As part of the EBU-led digital news project “A European Perspective”, EBU Members use AI-enabled recommendation and translation tools (Peach and EuroVox) to provide real-time information across borders and in the local language. These tools are extensively overseen and reviewed by editorial teams. This is just one example of the innovative use of AI by public service media.

With regard to the AI Act, the EBU and its Members have identified areas of concern in the category of high-risk AI systems. The broad scope of this category could threaten legitimate uses of AI systems in the media sector. Specifically, our concerns are:

  1. AI systems intended to produce complex text or to generate or manipulate image, audio or video content should not be classified as high risk by default.
    In the case of public service media, these systems are rigorously reviewed by editors, who also take responsibility for the results produced.
  2. The systematic classification of all AI systems intended for biometric categorization and identification of natural persons as high risk is too broad.
    This classification risks capturing benign AI systems developed or used by the media. Public service media could face excessive burdens for systems that suggest metadata tags for archival purposes, use facial/voice recognition to attribute content to celebrities or politicians, or identify monuments.
  3. The protection of the right to freedom of expression and the right to freedom of the arts and sciences must be maintained and clarified.
    As many audiovisual productions use AI for visual effects, the application of transparency rules must be flexible enough not to harm the user experience. Regulation must take into account the wide range of possible situations, for example the fact that viewers may start watching audiovisual content halfway through (making it difficult to identify the first interaction or exposure).