November 3, 2025
CE marking software under the EU AI Act – Who needs it and how to prepare a conformity assessment

From 2026, AI systems classified as high-risk under the EU Artificial Intelligence Act (Regulation (EU) 2024/1689) will have to undergo a conformity assessment and obtain a CE marking before being placed on the EU market or put into service. For many software vendors—especially those operating in finance, healthcare, HR, or public services—this introduces product-safety obligations that until now applied mainly to physical goods.
Does software really need CE marking under the AI Act?
The short answer is “yes” — if the software falls within the Act’s high-risk scope.

High-risk AI examples (Annex III): credit scoring, CV screening, fraud detection, medical diagnosis, critical infrastructure control, education grading, law enforcement risk assessment.
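The Annex III examples above can be turned into a first-pass screening helper. The sketch below is purely illustrative and assumes a flat keyword list drawn from this article’s examples — it is not a legal classification tool, and the function name `may_be_high_risk` is our own.

```python
# Illustrative Annex III screening sketch (not legal advice).
# The category list is taken from the examples named in this article.
ANNEX_III_EXAMPLES = {
    "credit scoring",
    "cv screening",
    "fraud detection",
    "medical diagnosis",
    "critical infrastructure control",
    "education grading",
    "law enforcement risk assessment",
}

def may_be_high_risk(use_case: str) -> bool:
    """Return True if the described use case matches an Annex III example."""
    return use_case.strip().lower() in ANNEX_III_EXAMPLES

print(may_be_high_risk("Credit scoring"))  # → True
print(may_be_high_risk("photo filters"))   # → False
```

A real assessment would of course weigh the system’s intended purpose against the full Annex III wording, not keyword matches.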

Obligations for providers, importers, and distributors are set out in Articles 16–25 of the AI Act, and for deployers in Article 26.

The conformity assessment procedure itself is outlined in Article 43 and Annex VII.

General CE-mark placement rules come from Regulation (EC) No 765/2008 (Art. 30) and Decision No 768/2008/EC (Annex II).
Post-market monitoring (Art. 72), serious incident reporting (Art. 73), and market surveillance under Regulation (EU) 2019/1020 also apply. Additionally, providers must keep technical documentation for 10 years and cooperate with national authorities (Arts. 18 and 21).
With AI Act enforcement phasing in over 2026–2027, compliance work should run in parallel with development now. CE-marking delays can mean suspended product launches and fines of up to €35 million or 7% of global annual turnover (Art. 99).
Let’s compare how the EU and the UK approach these issues, drawing on the UK Government policy paper A Pro-Innovation Approach to AI Regulation (2024) and Introduction to AI Assurance.

Before an AI system can legally carry the CE mark, developers and providers must work through a series of concrete steps defined in the AI Act — from risk classification to technical documentation and final declaration of conformity.
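The steps above can be sketched as an ordered compliance checklist. The step names and granularity below are our own illustration of the Article 43 / Annex VII flow, not an official template.

```python
from dataclasses import dataclass, field

@dataclass
class ConformityChecklist:
    """Illustrative tracker for the conformity steps named above (not an official template)."""
    steps: list = field(default_factory=lambda: [
        "Classify risk (Annex III screening)",
        "Set up risk and quality management (Arts. 9, 17)",
        "Compile technical documentation (Annex IV)",
        "Run conformity assessment (Art. 43, Annex VII)",
        "Sign the EU declaration of conformity",
        "Affix the CE marking and register the system",
    ])
    done: set = field(default_factory=set)

    def complete(self, step: str) -> None:
        self.done.add(step)

    def ready_for_ce_mark(self) -> bool:
        # CE marking is the final step; every preceding step must be done first.
        return all(s in self.done for s in self.steps[:-1])

checklist = ConformityChecklist()
for step in checklist.steps[:-1]:
    checklist.complete(step)
print(checklist.ready_for_ce_mark())  # → True
```

The point of the ordering is that the CE mark is an output of the process, not a box to tick in isolation — the declaration of conformity can only be signed once the documentation and assessment steps are complete.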
Blocshop helps organisations plan, build, and safely integrate AI into their software systems.
→ Book a free consultation with Blocshop to assess your AI system’s compliance readiness.
Head Office
Revoluční 1
110 00 Prague, Czech Republic
hello@blocshop.io