
Navigating the Labyrinth: The CE-Marking Challenges for AiMD and SaMD in the EU

The European Union’s Medical Device Regulation (EU MDR 2017/745) [1] and the new Artificial Intelligence Act (AIA) [2] introduced a seismic shift for medical device companies. For developers of Software as a Medical Device (SaMD) and, specifically, Artificial Intelligence Medical Devices (AiMD), obtaining the coveted CE mark has become significantly more complex.

[Illustration: AiMD and SaMD, showing a computer screen, a brain, a mobile device, a syringe and other elements linking technology to medicine]

Here are the key hurdles digital health manufacturers must overcome to achieve and maintain CE compliance.

The Looming Presence of the EU AI Act

While the MDR remains the law governing the CE mark, the recently adopted EU Artificial Intelligence (AI) Act adds an unprecedented new layer of regulatory complexity for high-risk AI applications, a category that includes most AiMD.

  • Dual Compliance: AiMD will likely need to comply with both the MDR (for safety and performance as a medical device) and the AI Act (for requirements related to data governance, transparency, robustness, and human oversight as a high-risk AI system).
  • Transparency and Explainability (XAI): The AI Act introduces strict requirements for transparency (Article 13) and human oversight (Article 14) for high-risk AI systems. Manufacturers will need to implement measures for Explainable AI (XAI), allowing users to understand the model’s output and decision-making process, which is a technical challenge for complex deep learning models; one common approach is sketched after this list.
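
To make the idea of explainability more concrete, here is a minimal, hedged sketch in Python using the open-source shap library on a purely synthetic model. The features, data and model are invented for illustration and do not represent any particular AiMD or any prescribed method.

```python
# Illustrative only: per-prediction feature attributions with SHAP on a
# synthetic model. All data, features and the model itself are placeholders.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                     # four hypothetical clinical inputs
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)     # synthetic "finding present" label

model = RandomForestClassifier(random_state=0).fit(X, y)

# SHAP attributes each individual prediction to the input features, giving a
# per-case explanation of which inputs drove the model's output.
explainer = shap.Explainer(model.predict_proba, X)
explanation = explainer(X[:5])
print(explanation.values.shape)  # (cases, features, classes)
```

In practice, the explanation mechanism and the way it is presented to the clinical user would themselves need to be documented and validated as part of the technical documentation.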

The Up-Classification Challenge

Under the previous Medical Device Directive (MDD), many low-risk SaMDs were Class I devices, allowing manufacturers to self-certify without the review of a Notified Body (NB).

  • The MDR Shift: The MDR introduced stricter, risk-based classification rules for software (Rule 11). Because most SaMD/AiMD are intended to support clinical decision-making, most products will be at least Class IIa and will therefore require a Notified Body for conformity assessment. Manufacturers should also establish whether the software functionality of a hardware device has its own medical purpose, making it a medical device in its own right, or whether the software merely drives or influences the hardware, in which case it may be considered an “accessory” under the MDR or the In Vitro Diagnostic Regulation (EU) 2017/746 (IVDR) [3].
  • The Consequence: Manufacturers previously operating under self-certification must now undergo rigorous third-party audits, significantly increasing time-to-market and compliance costs. Furthermore, it can be difficult to identify a Notified Body that is designated for software conformity assessments, has available capacity, and offers reasonable timeframes.

Dynamic Algorithms vs. Static Documentation

AiMDs, particularly those employing Machine Learning (ML), are designed to learn and adapt over time. This dynamic nature clashes with the MDR’s demand for static, controlled, and pre-market-approved documentation.

  • Algorithmic Modification: A core difficulty is demonstrating compliance when the algorithm’s performance changes after deployment (known as “algorithm drift”); a simple post-market drift check is sketched after this list. Under the MDR, any significant change or modification to the intended purpose or design may require a new UDI-DI and assessment by the Notified Body, resulting in a supplement to the CE certificate or possibly a new conformity assessment.
  • The Burden of Proof: Regulators need assurance that new versions or updates do not negatively impact patient safety or clinical performance. Manufacturers must implement a quality management system (QMS) that captures these modifications under a change-control mechanism, one that clearly defines what constitutes a “significant” change requiring NB notification versus a “non-significant” change that can be self-declared. This underlines the importance of tailoring the QMS to the product rather than simply using available templates: those that work for static hardware products may not be effective for dynamic machine-learning platforms.
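
As a hedged illustration of the drift-monitoring idea, the Python sketch below computes a simple Population Stability Index (PSI) between the input data an algorithm was validated on and the data it sees in the field. The data, the 0.2 threshold and the escalation step are assumptions chosen for illustration, not regulatory criteria.

```python
# Illustrative only: a simple Population Stability Index (PSI) check that could
# feed the QMS change-control process. Data and thresholds are hypothetical.
import numpy as np

def psi(reference: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between a reference and a live distribution."""
    edges = np.histogram_bin_edges(reference, bins=bins)
    edges[0], edges[-1] = -np.inf, np.inf          # catch values outside the reference range
    ref_pct = np.histogram(reference, bins=edges)[0] / len(reference)
    cur_pct = np.histogram(current, bins=edges)[0] / len(current)
    ref_pct = np.clip(ref_pct, 1e-6, None)         # avoid log(0) / division by zero
    cur_pct = np.clip(cur_pct, 1e-6, None)
    return float(np.sum((cur_pct - ref_pct) * np.log(cur_pct / ref_pct)))

rng = np.random.default_rng(42)
validation_inputs = rng.normal(loc=0.0, scale=1.0, size=5000)   # data the model was validated on
live_inputs = rng.normal(loc=0.4, scale=1.1, size=5000)         # post-deployment inputs have shifted

score = psi(validation_inputs, live_inputs)
# A threshold of 0.2 is a common internal heuristic, not a regulatory limit; whether a
# shift amounts to a "significant" change remains a documented regulatory judgement.
print(f"PSI = {score:.3f} -> {'investigate drift' if score > 0.2 else 'within expected range'}")
```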

Enhanced Clinical Evidence Requirements

The MDR significantly raised the bar for demonstrating clinical safety and performance, a major hurdle for novel AiMD.

  • Rigor of the CER: The Clinical Evaluation Report (CER) must now provide robust, scientifically sound evidence demonstrating the device’s intended clinical benefit and an acceptable benefit-risk ratio. Furthermore, claiming equivalence between artificial intelligence platforms is difficult, both because of the stricter MDR rules and because AiMD are typically novel and unique. Equivalence is further complicated by the fact that AiMD technologies can lack transparency (e.g. machine-learning “black boxes”) and, as mentioned above, can be dynamic. On the other hand, SaMD and AiMD can sometimes benefit from streamlined clinical evaluations via retrospective data analysis.
  • Data Governance & Bias: For AiMD, this extends to the data used for training and validation. Requirements for the data used to train the algorithm exist in both the MDR/IVDR and the new Artificial Intelligence Act. Manufacturers must demonstrate that the datasets are representative, unbiased, and of high quality [4]; a basic representativeness check is sketched below.
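
As a simple, hedged example of what a representativeness check might look like, the sketch below compares the demographic mix of a hypothetical training set against the intended-use population. The figures and the 10% threshold are invented, and real data-governance evidence under the MDR and AI Act goes well beyond a check like this.

```python
# Illustrative only: comparing the demographic mix of a hypothetical training
# set against the intended-use population. All figures are invented.
import pandas as pd

training_mix = pd.Series({"age_18_40": 0.55, "age_41_65": 0.35, "age_over_65": 0.10})
intended_population = pd.Series({"age_18_40": 0.30, "age_41_65": 0.40, "age_over_65": 0.30})

gap = (training_mix - intended_population).abs()
report = pd.DataFrame({"training": training_mix,
                       "intended_use": intended_population,
                       "absolute_gap": gap})
print(report)

# Flag subgroups that are clearly under-represented (threshold chosen arbitrarily here).
under_represented = gap[(training_mix < intended_population) & (gap > 0.10)]
print("Under-represented subgroups:", list(under_represented.index))
```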

Cybersecurity as a General Safety and Performance Requirement (GSPR)

For any connected SaMD or AiMD, cybersecurity is a fundamental requirement in its own right.

  • GSPR 17: Annex I of the MDR (GSPR 17) explicitly requires devices with programmable electronic systems to be designed to ensure data and system security. This includes protecting against unauthorized access, data breaches, and ensuring the integrity of the clinical data the software processes.
  • Artificial Intelligence Act (Article 15): This article requires high-risk AI systems to achieve an appropriate level of cybersecurity, covering not only the software itself but also associated assets such as the training data sets.
  • General Data Protection Regulation (GDPR): Many SaMD and AiMD also process personal data. Under the GDPR, the data collected must therefore be kept to a minimum and protected accordingly. The GDPR also places requirements on data processing beyond security and privacy: there must always be a lawful basis (Article 6), and data subject rights must be respected (Chapter III) [5]. Furthermore, GDPR compliance may apply from early development, so it can become relevant before the MDR CE-marking process begins.
  • Continuous Monitoring: Since software vulnerabilities are constantly discovered, manufacturers must implement a proactive system for monitoring, patching, and updating the device’s security throughout its entire lifecycle, managed through the Post-Market Surveillance (PMS) system. One small building block of such a process, an artefact integrity check, is sketched below.
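
By way of a small, hedged example of the kind of control that can sit inside a secure update process, the sketch below verifies a model artefact against a published SHA-256 hash before it is loaded. The file name and the expected hash are placeholders, and a real secure-update process would involve far more than this single check.

```python
# Illustrative only: verify a released model artefact against a known SHA-256
# hash before loading it. The file name and expected hash are placeholders.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large model files are not read into memory at once."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

EXPECTED_HASH = "hash-published-with-the-signed-release"   # placeholder value

artefact = Path("model_v1_2.bin")                          # hypothetical artefact name
if artefact.exists() and sha256_of(artefact) == EXPECTED_HASH:
    print("Artefact integrity verified; safe to load.")
else:
    print("Integrity check failed or file missing: do not deploy and raise a security incident.")
```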

Conclusion: Strategic Compliance is Key

The transition to EU MDR—compounded by the emergence of the AI Act—presents profound challenges for SaMD and AiMD developers. The path to CE marking is now longer, more resource-intensive, and demands a level of continuous compliance management far exceeding previous standards.

To succeed, manufacturers must adopt a proactive, strategy-first approach:

  1. Early Classification: Accurately classify the device early to identify all the applicable regulations and standards. Confirm whether Notified Body involvement is required and select the conformity assessment route which aligns with all applicable regulations (e.g. MDR and AIA).
  2. AI-Centric QMS: Integrate AI-specific requirements (such as data management, algorithm change control, and cybersecurity) directly into the Quality Management System (QMS).
  3. Invest in CER: Prioritize rigorous, well-documented clinical validation and post-market clinical follow-up (PMCF) from day one. Retrospective data can be useful to reduce clinical development costs; however, the data must be representative of the final intended use population.

How can Lumis help?

Lumis International is a regulatory affairs consultancy specializing in supporting small and medium-sized medical device companies.

Our experts possess a unique strength in the rapidly evolving digital health sector, with an established niche in securing the CE-mark for Software as a Medical Device (SaMD) and Artificial Intelligence Medical Device (AiMD) products.

We provide both strategic and operational support to streamline your route to compliance and accelerate your product’s market entry.

Do you want to CE-mark your digital health product? Contact us to discuss your project.

References

  1. Medical Devices Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices https://eur-lex.europa.eu/eli/reg/2017/745/oj/eng#anx_IX
  2. Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence. https://eur-lex.europa.eu/eli/reg/2024/1689/oj/eng
  3. MDCG 2019-11 Rev.1. Guidance on Qualification and Classification of Software in Regulation (EU) 2017/745 – MDR and Regulation (EU) 2017/746 – IVDR. October 2019; Rev.1 June 2025. https://health.ec.europa.eu/document/download/b45335c5-1679-4c71-a91c-fc7a4d37f12b_en?filename=mdcg_2019_11_en.pdf
  4. MDCG 2025-6. Interplay between the Medical Devices Regulation (MDR) & In Vitro Diagnostic Medical Devices Regulation (IVDR) and the Artificial Intelligence Act (AIA). June 2025. https://health.ec.europa.eu/document/download/b78a17d7-e3cd-4943-851d-e02a2f22bbb4_en?filename=mdcg_2025-6_en.pdf
  5. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data. https://eur-lex.europa.eu/eli/reg/2016/679/oj/eng




Liam Spencer

This post was written by Liam Spencer. Liam is Senior Project Manager, Regulatory Consulting and Services. His 8 years’ experience includes both pharmaceutical and medical device regulatory affairs. He has worked across different technologies, including small chemical entities, ATMPs, biologics, vaccines, combination products, and Class I-III devices. Working across all phases of development has allowed Liam to gather a broad understanding of regulatory affairs and, therefore, devise creative regulatory strategies.

Throughout his career, Liam has also worked on complex medical device international submissions across various countries and continents, providing a foundation to understanding global expansion from a regulatory perspective.

His strategic and project management skillset was developed at his previous position as a global regulatory lead for a large CRO (Contract Research Organisation). Liam has worked for companies of different sizes and backgrounds and, with his strong experience in a consultancy environment, he can tailor regulatory solutions to the client’s needs.