Are your AI systems ready for Europe’s new AI regulation?

It looks as though many seemingly simple algorithms will be classified as high-risk systems in the new EU artificial intelligence legislation. All organisations that develop data-driven services therefore need to investigate how the new rules affect them – and they will most likely need to CE-mark their systems.

The AI Act is currently being negotiated and deliberated in the EU’s legislative machinery. When the regulation comes into force, tentatively in 2025, all AI systems introduced thereafter will be covered, as well as existing systems that are updated or substantially modified.

Similarly to how the General Data Protection Regulation (GDPR) affected all industries, the AI Act is expected to impose stricter requirements on procedures and processes to ensure the quality of high-risk AI systems.

“Products you are drafting and designing today will come out on the market when the AI Act has come into force,” says Håkan Burden, Senior Researcher in AI and IT systems at RISE. “Much more than expected will then be classified as AI systems. It’s no longer about what you as a manufacturer consider to be an AI system. That’s no longer relevant. It’s the Act that will define high-risk systems and AI technologies.”

Technical and legal consequences

Håkan Burden and his colleague at RISE, Susanne Stenberg, legal expert in new technology, have reviewed the proposal for the regulation. They see many technical and legal consequences for companies and operators.

“For example, the rules affect certain businesses and certain products; they serve both to safeguard individual rights and to ensure product safety,” explains Stenberg.

In short, the regulation identifies a large number of businesses and operations in which AI systems pose a high risk, among them critical infrastructure, education, law enforcement, management processes, and employment. In practice, these systems may be algorithms that, for example, calculate compensation when parents take time off work to care for a sick child, determine the right to student aid, check creditworthiness, or run smart services in mobility and energy storage.

AI systems in some physical products are also classified as high risk. This applies to systems that are part of the safety system in machines, medical equipment, cable cars, and so on.

“If you have a system that controls a machine to ensure it does not drive over somebody, this can be classified as a safety system,” says Burden. “Another example is the windscreen wiper system in mining machines – the safety function is classified as an AI system. Or a laser gate. A multitude of product operations become high-risk operations.”

“Pick-up-sticks of regulations”

When it comes to AI in products, the field is not particularly easy to navigate. Vehicles, forestry machinery, and ships – along with associated equipment – are exempt from the regulation, and for the time being are regulated by UN agencies. The practical consequences for companies that supply both EU-regulated machines and UN-regulated vehicles need to be investigated.

“There will be different thresholds internally in an organisation in terms of what this means specifically in the business,” says Stenberg.

“It will be like a pick-up-sticks of regulations,” says Burden. “Which stick is best to pick up? Here at RISE we can help determine that.”

Together they form a team focused on research into software, law, and policy development, and are therefore uniquely positioned to interpret the new AI Act. Both researchers also see that RISE’s other business areas have important roles to play as third parties in verifying that test results and data quality meet the requirements for CE marking of AI systems.

“RISE can serve as a bridge that understands the perspectives of authorities as well as product developers. We can also analyse how the EU’s digitalisation regulations interact and affect one another; following the AI Act, the Data Governance Act (data management) and the Data Act (data access) are next in line to be adopted,” says Susanne Stenberg.

Contact person

Susanne Stenberg

Senior Researcher / Legal Expert

+46 73 398 73 41

Contact person

Håkan Burden

Senior Researcher

+46 73 052 02 29
