Tier 1: The Overview
State of Play
The headline: The EU moved early on AI regulation through the comprehensive AI Act, with the intention of gaining a first-mover economic advantage by setting the global regulatory “gold standard” for AI governance. The bloc is now seeking to balance the implementation of the AI Act with a renewed focus on improving the EU’s global competitiveness in AI and innovation.
The context: The EU has a thriving AI startup environment, due in part to its strong university and R&D culture. However, the risk-averse investment landscape means that scaling EU startups can be challenging, with US private AI investment levels now nearly eight-fold higher than those in the EU. To increase the EU’s competitiveness across the AI innovation chain, in February 2025 (as part of France’s AI Action Summit) the European Commission announced a new funding initiative, InvestAI, to mobilise €200 billion for AI investment from a combination of Member States and private sector partners. This includes a new fund of €20 billion to build state-of-the-art AI “gigafactories”, capable of powering the most advanced AI models.
The rules: The EU's AI Act establishes rules in every EU member state around the development, use, import, distribution and manufacture of AI systems. Importantly, the AI Act will affect businesses both inside and outside the EU (see below). It sits alongside existing legislative frameworks such as those for data (GDPR) and consumer protection that capture some AI use cases, creating a complex web of regulatory requirements depending on the sector and product. Each EU country will be required to nominate a regulator (termed a "national competent authority") to monitor and enforce the implementation of the AI Act in their country.
What this means for founders
The AI Act’s broad ‘extra-territorial’ scope means that AI-enabled products that are sold, put into service or deployed in any EU market, or whose outputs are used in the EU, will need to be compliant, regardless of where the company is based or where its models are trained. The majority of obligations under the AI Act apply to AI developers, with the most stringent requirements applying to AI systems that can be used in specified ‘high-risk’ use cases. The AI Act will be supplemented by guidance and ‘delegated acts’ from the EU Commission (see our full implementation timeline below), meaning that founders will need to constantly monitor the specific rules and enforcement deadlines that will affect them.
One aspect of the EU AI Act that people sometimes forget is that only a few "traditional" machine learning systems are considered high-risk. There is an opportunity to demonstrate that your application does not affect the health, safety or fundamental rights of the user, which means that you can be exempted from the strictest obligations.
Agata Hidalgo
European Affairs Lead, France Digitale
Forward Look
Startups stand to benefit from the EU’s competitiveness pivot.
AI innovation is now seen as a strategic imperative in a fragmenting geopolitical picture. Initiatives such as the above-mentioned InvestAI and the AI Factories - currently being set up in Finland, Germany (HammerHAI and JAIF), Greece, Italy, Luxembourg, Spain, Sweden, Austria, Bulgaria, France, Poland, and Slovenia - which aim to expand the provision of EU-located computing and supercomputing resources, are specifically targeted at startups in order to create an innovation pipeline that grows future “European AI champions” and reduces the dominance of US tech firms in the EU market.
Other startup-friendly AI policies to watch during the 2024-2029 European Commission mandate include the introduction of a Cloud and AI Development Act to accelerate data centre construction and increase compute capacity; an Apply AI Strategy to enhance new industrial applications of AI and improve public services; the establishment of the European AI Research Council to consolidate AI R&D resources; and an updated Data Strategy to enable more extensive data sharing among businesses.
Digital regulation - including the AI Act - is set to be streamlined as part of the competitiveness drive.
Concerns have mounted that the difficulty of navigating the EU’s complex web of overlapping regulation could dissuade SMEs in particular from starting and scaling in the EU. The resulting streamlining effort is likely to include clearer guidance on how potentially overlapping legislation - such as the AI Act and the GDPR - interacts, and some simplification or harmonisation of compliance obligations across digital legislation for startups and mid-caps.
Timelines
2 February 2025: AI practices deemed to pose "unacceptable risk" become prohibited, and general provisions (e.g., requirements on businesses relating to ‘AI literacy’) apply.
May 2025: The finalised Codes of Practice are expected to be published.
2 August 2025: Obligations on developers of ‘general-purpose’ AI systems will come into force, and the EU’s review of amendments to the list of prohibited AI practices will be completed.
2 August 2025: Member States must designate national competent authorities to monitor the implementation of the Act in each jurisdiction.
Q4 2025-Q1 2026: The Cloud and AI Development Act is expected to be introduced.
By June 2026: The deadline by which the Copyright Directive must be reviewed, which could see the legislation reopened to include provisions on the interaction between generative AI and copyright law.
2 August 2026: Obligations for AI systems deemed to be "high-risk" (not including those intended to be used as a safety component of a product) will come into force.
2 August 2026: Deadline for Member States to implement rules on penalties and to establish at least one operational AI regulatory sandbox.
2 August 2026: The AI Office will issue further guidance for providers and deployers on the obligations to inform users when they are interacting with an AI system.
2 August 2027: Obligations come into force for high-risk AI systems that are intended to be used as a safety component of a product.
By the end of 2030: Obligations go into effect for certain AI systems that are components of large-scale IT systems established by EU law in the areas of freedom, security and justice.