AI Act now: 
Implementing the EU AI regulation

How to successfully implement AI in your business - Part 5

2 August 2024
Sebastian Bluhm

Before it gets easier, it usually gets hard. That is as true in sport as it is for the introduction of AI: before AI automates processes, speeds them up and makes your life considerably easier, there are a few things to consider.


The AI Act, the world's first comprehensive AI law, which recently came into force, adds another challenge, namely the question: is your AI legally compliant? The new regulation presents companies with considerable challenges. Among other things, the minimum standards it prescribes make it necessary not only to know exactly which technologies are in use and to categorise their purpose, but also to ensure appropriate transparency and quality.


But beware! In this blog post you can expect neither pity nor comfort. Instead, you will get what you really need: information that will help you set up, or rebuild, your AI systems in line with EU requirements. We are convinced that the AI Act is a huge opportunity for companies. What that opportunity is and how you can seize it is the subject of this blog post.

AI Act key takeaways

It's a done deal: with 523 votes in favour, 46 against and 49 abstentions, MEPs approved the world's first comprehensive AI law. What this means for companies and their AI systems, however, is still unclear to many decision-makers. This is because from now on, AI must not only function in accordance with your business objectives; it must also fulfil legal requirements. In addition to the already complex consideration of strategic, mathematical, technical and economic issues, there is now an added dimension: the legal one. A well-considered and strategic approach to the introduction and use of artificial intelligence is therefore more important and more challenging than ever.

What managers should know

Existing AI systems must be adapted or rebuilt in accordance with the AI Act. New AI models will have to take the new regulations into account during development. So there is a lot to do. But what exactly? Information and knowledge in all relevant areas are key to the successful implementation of the AI Act. Below you will find an overview of the most important facts.

What is the AI Act?

The EU AI Act is the first comprehensive AI law in the world. It aims to ensure safety in the handling, use and development of artificial intelligence, protect fundamental rights and promote innovation.

When does the AI Act apply?

The law came into force on 1 August 2024. Its provisions will become applicable in stages over the next 24 to 36 months. This so-called ‘transitional phase’ is intended to allow the responsible authorities to establish their structures and to give companies time to adapt to the new rules.

Who does the AI Act apply to?

As a rule of thumb, anyone in the EU who develops or uses AI for business purposes - whether their own or someone else's - is affected by the AI Act. This means that even if you use AI software from other providers in your company, the AI Act applies to you. But be careful: What does AI actually mean here?

What is considered AI?

What counts as AI is a question of definition, and the AI Act has its very own: within the regulation, artificial intelligence is described as software that generates results such as content, predictions, recommendations or decisions. It can be based on machine learning or deep learning, but also on statistical approaches, Bayesian estimation, or search and optimisation methods. This is a very broad definition, especially considering that companies have been using statistical methods for decades.

So beware: even software and systems that were never introduced or developed under the label of artificial intelligence can fall into this category, as the sketch below illustrates. If you are not sure exactly which technologies you are using, a thorough analysis in the near future will be unavoidable.
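
To make this tangible, here is a deliberately simple, purely illustrative Python sketch: a plain logistic regression used for loan pre-screening. The library, the scenario and the invented numbers are our assumptions, not part of the regulation. Nobody would market this little script as ‘AI’, yet it is statistical software that generates predictions about people - exactly the kind of system the AI Act's broad definition can cover.

```python
# Illustrative only: a plain logistic regression for credit pre-screening.
# Nothing here is labelled "AI", yet it generates predictions from a
# statistical / machine-learning model - which is what the AI Act's
# broad definition is aimed at.
from sklearn.linear_model import LogisticRegression
import numpy as np

# Invented historical data: [income in kEUR, years as customer]
X_train = np.array([[35, 2], [55, 8], [28, 1], [72, 12], [41, 5]])
y_train = np.array([0, 1, 0, 1, 1])  # 1 = loan repaid, 0 = default

model = LogisticRegression().fit(X_train, y_train)

# The "harmless" scoring step: a prediction that may influence a decision
# about a person - and therefore needs to be assessed under the AI Act.
applicant = np.array([[48, 3]])
print("Approval recommendation:", int(model.predict(applicant)[0]))
```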

Your new obligations

The AI Act includes rules for the development, provision and use of AI systems, categorised by risk level. The higher the risk potential, the more comprehensive the regulation. In plain language, this means strict rules for high-risk systems, lighter rules for low-risk systems and outright bans on certain applications. The regulation also prescribes minimum standards, for example for:

  • risk management
  • data quality and data governance
  • documentation
  • transparency
  • human oversight
  • logging
  • accuracy
  • robustness
  • cybersecurity

Evaluating the risk level of an AI use case is not done by an external test centre; it is your responsibility. This so-called ‘conformity assessment’ must be carried out with care, because the proof that the AI you use complies with the new regulations must stand up to any subsequent external audit. The evaluation also gives you a solid working hypothesis from which to derive future obligations and tasks.
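
As a starting point for such an assessment, it helps to keep a structured inventory of every system in use, its purpose and your current risk classification. The following Python sketch illustrates that idea with invented systems and freely chosen fields; the risk tiers mirror those of the AI Act, but assigning a real use case to a tier remains a legal judgement that no script can make for you.

```python
# A minimal sketch of an internal AI inventory. System names, fields and
# the example to-dos are invented; the risk tiers mirror the AI Act's
# categories, but the classification itself is a legal assessment.
from dataclasses import dataclass, field
from enum import Enum

class RiskLevel(Enum):
    UNACCEPTABLE = "prohibited practice"
    HIGH = "high risk"
    LIMITED = "limited risk (transparency duties)"
    MINIMAL = "minimal risk"

@dataclass
class AISystem:
    name: str
    purpose: str
    provider: str            # own development or third-party software
    risk_level: RiskLevel
    open_tasks: list[str] = field(default_factory=list)

inventory = [
    AISystem("CV pre-screening", "rank job applications", "third party",
             RiskLevel.HIGH,
             ["document data governance", "enable human oversight", "set up logging"]),
    AISystem("Website chatbot", "answer customer questions", "own",
             RiskLevel.LIMITED,
             ["disclose that users interact with an AI system"]),
]

for system in inventory:
    print(f"{system.name}: {system.risk_level.value} -> {system.open_tasks}")
```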

Risk or opportunity? Both!

As different as the task of adapting to the regulation will look from company to company, the starting position is the same for everyone: the AI Act is the starting signal for a new race, and a delayed start could prove expensive. Companies that fail to comply with its requirements face fines of up to EUR 35,000,000 or up to 7% of total global annual turnover in the previous financial year, whichever is higher. In the long term, however, missing the chance to set yourself apart from the competition through early compliance would be even more costly.

How to safeguard your investment

If you have invested in artificial intelligence in recent years, you have already created business value, competitive advantages and automated processes that are affected by the AI Act. This makes it all the more important to gain clarity early on about the extent to which the new regulations could jeopardise these investments. It is better to assess and implement the necessary adjustments today than to wait until tomorrow. This will allow you to maintain your competitive edge and the value you have built in the long term.

It pays to be quick

Momentum is also a decisive factor here. Who in your industry will be the first to say that they use or have developed EU-compliant, transparent, reliable and secure AI? CE marking for your AI system creates trust and separates the wheat from the chaff. Use the opportunity the AI Act offers you: lead the way visibly instead of following invisibly in the crowd.

What you can do now

  • Get smart: Know your new rights and obligations!
  • Check existing systems: Be aware that some existing systems also fall under the new definition of AI and may need to be revised or even replaced under the AI Act. 
  • Determine the level of risk: Evaluate your AI models according to risk level and define a working hypothesis from which to derive the upcoming tasks.
  • Plan measures: Initiate adjustments to systems, processes, governance and responsibilities at an early stage.
  • Get active: Compliance measures must be implemented and adhered to. Your data must be relevant, representative, accurate and complete, and must not contain any (even unintentional) bias; a minimal example of such a check is sketched after this list.
  • Seize opportunities: Show yourself to be a modern, innovative and responsible company - right up to the top management level. CE marking for your AI system creates trust and secures your pioneering role, both internally and externally.
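
To illustrate the kind of data check meant in the ‘Get active’ point above, here is a rough Python sketch that compares the rate of positive outcomes between two groups in a training set. The column names, the invented data and the 20-percentage-point threshold are our assumptions, not requirements of the AI Act; in practice, checks like this would be defined as part of your data governance.

```python
# A rough, illustrative check of one aspect of data quality: whether the
# positive outcome rate in the training data differs strongly between
# groups. Column names and the threshold are assumptions, not legal rules.
import pandas as pd

df = pd.DataFrame({
    "gender":  ["f", "m", "f", "m", "f", "m", "f", "m"],
    "outcome": [1, 1, 0, 1, 0, 1, 0, 1],   # 1 = positive decision
})

rates = df.groupby("gender")["outcome"].mean()
print(rates)

# Flag the dataset for review if the gap between groups exceeds an
# internally defined threshold (here 20 percentage points, chosen freely).
if rates.max() - rates.min() > 0.20:
    print("Potential imbalance - review data collection and labelling.")
```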

Conclusion

Yes, the AI Act presents companies with new challenges. Companies need to know, understand and implement the regulation, and weighing up the various AI approaches will become even more complex; it will not be possible without a holistic view of all relevant factors. However, the AI Act also offers many opportunities: time and knowledge will become decisive factors. The EU AI Act will be fully applicable in two to three years at the latest; by then it will be too late for ‘quick adjustments’. The AI regulation is coming one way or another. Those who prepare early risk nothing, and those who don't bury their heads in the sand now, but instead focus on the EU-compliant implementation of their AI projects, stand to gain a great deal as pioneers.

AI Act Audit

The AI Act has also presented us with a new challenge: anyone who wants to set up or rebuild their AI in line with EU requirements needs both technical expertise and legal advice. But where do you get both? At PLAN D, we have therefore developed a new consulting service that offers companies exactly what they need to implement the AI Act: legal certainty and technical implementation from a single source.

In cooperation with Taylor Wessing Germany, the leading law firm for IT and data protection law, we are therefore bundling our expertise into a unique AI Act Audit. In a two-stage process, we guide you through the maze of regulations, analyse AI applications and translate legal requirements into specific individual to-do lists - ready for EU-compliant implementation.

Our webinar with Taylor Wessing

In the webinar on 20 June 2024 (in German), Fritz Pieper from Taylor Wessing and Sebastian Bluhm answer each other's most pressing questions. What is legally considered AI? What does the AI Act mean for my AI project plan? Who is liable: service provider or client? Find out here.

Want to know more?