
EU AI Act: What the Regulation Means for Your NGO

In many NGOs and associations, AI is already in everyday use, for example to create texts and images or to segment donor data. With the EU AI Act, the EU has now established clear rules for the use of AI that also apply to NGOs. Read on to learn what you need to keep in mind to meet the new requirements.


What is the EU AI Act?

The EU AI Act came into effect on August 1, 2024, marking a significant milestone in the regulation of artificial intelligence. It defines how AI may be used in the EU and the rules that must be followed.

The AI Act is based on a “risk-based approach.” This means that AI systems are classified into categories according to the risk they pose to people and society.


The categories are as follows:

  • Unacceptable Risk: AI applications that pose an unacceptable threat, such as social scoring, are completely prohibited.
  • High Risk: Strict requirements and controls apply here, for example to AI applications in healthcare or education.
  • Limited Risk: Transparency obligations apply here, for example to chatbots and AI-generated content.
  • Minimal Risk: These applications only need to meet few or no requirements.

For NGOs, the “limited risk” and “high risk” categories are particularly relevant, as most AI systems they use fall into these categories.


Generative AI (e.g., ChatGPT)

Generative AI systems that create content may fall into different risk categories depending on their use. If an NGO uses ChatGPT to create content, it must assess whether the application potentially poses high risks, such as spreading misinformation. In such cases, transparency obligations and risk mitigation measures are required.


Predictive AI

Predictive AI systems that make forecasts can also be classified as high-risk, especially when they influence decisions that impact individuals, such as in healthcare or finance. NGOs that use such systems must ensure they comply with the AI Act’s requirements, including transparency, traceability, and regular reviews.


Transparency Obligation – What Does This Mean?

One of the most important changes for NGOs is the transparency obligation. The AI Act requires organizations to disclose when they use AI-generated content. This is intended to make it clear when an image, text, or prediction is generated not by humans but by a machine.

For instance, if an NGO wants to use an AI-generated image for a fundraising campaign, this must be indicated in the future, for example with a watermark, a caption, or a notice in the text. A simple statement like “This image was created with AI” is sufficient to comply with the AI Act.

Additionally, it’s beneficial to explain the context of AI use, for example: “This image was generated with AI to visually support our message.” Potential limitations and misunderstandings should also be openly communicated. For photorealistic images, it is advisable to clarify that the image is not an actual photo but purely illustrative.
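The recommendations above can be combined into one consistent notice. As a small sketch – the wording and the helper function are our own suggestion, not text prescribed by the AI Act:

```python
def ai_disclosure(context: str = "", photorealistic: bool = False) -> str:
    """Build an AI-generation notice for an image caption."""
    notice = "This image was created with AI"
    if context:
        notice += f" to {context}"
    notice += "."
    if photorealistic:
        # For photorealistic images, add that the picture is illustrative only.
        notice += " It is purely illustrative, not an actual photo."
    return notice

print(ai_disclosure("visually support our message", photorealistic=True))
# This image was created with AI to visually support our message. It is purely illustrative, not an actual photo.
```

Using one helper like this keeps the wording identical across campaigns, so no image goes out without a notice.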

Even though most AI-generated images still look noticeably different from real photos, some image AIs can already create deceptively realistic pictures. The EU AI Act requires that AI-generated images be labeled as such in the future.

Risk Assessment and Documentation – What’s Involved?

Every NGO using AI should regularly assess the associated risks. This means:


Risk Assessment

Before implementing an AI system, the NGO should consider whether the system could disadvantage people in any way or lead to misunderstandings. For example: how high is the risk that the system produces misinformation because its underlying data is outdated?


Documentation

It’s essential to record all details of the AI system’s use – for example, how and why a particular AI system is used and what steps have been taken to minimize risks. This allows for transparency and traceability of the system’s operation and usage if needed.
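What such a record could look like in practice is sketched below. The field names are illustrative assumptions on our part – the AI Act does not prescribe a specific format:

```python
import json

# Illustrative documentation record for one AI system in use
# (field names are our own suggestion, not mandated by the AI Act).
record = {
    "system": "ChatGPT",
    "purpose": "Drafting newsletter texts and social media posts",
    "last_reviewed": "2025-01-15",
    "risk_category": "limited risk (transparency obligations)",
    "risks_identified": ["outdated information", "fabricated facts"],
    "mitigations": ["human review before publishing", "AI-generation notice on published content"],
}

# Store the record so the system's use stays transparent and traceable.
with open("ai_documentation.json", "w", encoding="utf-8") as f:
    json.dump(record, f, indent=2, ensure_ascii=False)
```

Even a simple file like this, updated at every review, goes a long way toward the traceability described above.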


Potential Penalties – What Happens in Case of Violations?

Violations of transparency obligations or other AI Act rules can be costly for NGOs. Fines of up to 15 million euros or up to 3% of total worldwide annual turnover can be imposed – whichever amount is higher.
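The “whichever amount is higher” rule can be made concrete with a short calculation. This is only a sketch of the upper limit – actual fines are set case by case by the authorities:

```python
def max_fine_eur(annual_turnover_eur: float) -> float:
    """Upper limit of a fine under the 15-million-euro / 3%-of-turnover rule."""
    return max(15_000_000.0, 0.03 * annual_turnover_eur)

# A small NGO with 2 million euros annual turnover: the 15M flat cap applies.
print(max_fine_eur(2_000_000))    # 15000000.0
# A large organization with 600 million euros turnover: 3% = 18M is higher.
print(max_fine_eur(600_000_000))  # 18000000.0
```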


What Should NGOs Do Now?

For NGOs already using AI or planning to do so in the future, it’s advisable to implement a few key practices now:

  • AI Generation Notice: Simple labeling of AI content – e.g., through a caption or note in the text.
  • Risk Assessment and Documentation: Regularly review the AI systems used and keep detailed records of their application.
  • Communication with Target Audiences: Communicate openly about why AI is used and the benefits it brings to the target audiences or organization.

By taking these steps, NGOs can stay compliant and avoid potential penalties. The AI Act may seem complex, but with transparency and clear communication, NGOs can meet its requirements while leveraging AI’s benefits for their work.


Read more about AI for NGOs here: AI for NGOs
