European start-ups building defence-grade AI face a regulatory environment designed for a different era. Here's what the path to procurement actually looks like — and what's blocking it.
Somewhere in Europe right now, a team of engineers is building an AI system that could genuinely improve European defence capability. They have a working prototype, early customers, and a clear path to deployment. They also have a legal department that has spent six months trying to work out whether their product is covered by the EU AI Act, the Defence Industrial Strategy, export control regulations, or all three simultaneously.
This is not an edge case. It is the standard experience of European deep tech companies building at the intersection of AI and defence.
Dual-use technology — systems with both civilian and military applications — sits in a regulatory no-man's land created before AI became a meaningful factor in defence capability. Export control frameworks built for physical hardware struggle to apply to software systems. The EU AI Act's carve-outs for national security applications create ambiguity rather than clarity. And procurement frameworks built around large defence primes are systematically inaccessible to start-ups operating at the speed the technology requires.
"We build AI systems that are used in both civilian and defence contexts — that is not a problem to be solved, it is the commercial reality of where the best technology comes from. What needs to change is the regulatory and procurement infrastructure around that reality. European industry is ready. The frameworks are not."
Ajay Charkavarthy — Chief AI Officer, Thales UK
Systems used exclusively for national security purposes fall outside the EU AI Act's scope — but this exemption is narrower than it sounds. Any system with a civilian use case, operated by a civilian agency, or deployed in a context where it could affect civilians is likely within scope. For dual-use AI companies, this means almost everything they build requires a compliance assessment. The good news: the Act creates a clearer framework than the pre-2025 patchwork. The challenge: compliance is resource-intensive and the guidance is still evolving.
The EU's dual-use regulation governs the export of certain AI technologies to third countries. Applying it to software-defined AI systems requires interpretation that often has to be sought from national authorities, creating significant delays and uncertainty. Several European start-ups have effectively been blocked from international partnerships by export control ambiguity that took 12–18 months to resolve.
NATO's DIANA and the European Defence Fund were both designed to improve start-up access to defence procurement. Progress has been real but uneven. The application and approval cycles remain long relative to commercial timelines, and the contract structures often create IP and data ownership challenges that are difficult for early-stage companies to accept.
"NATO's approach to autonomous systems must balance operational necessity with clear ethical and legal frameworks. We are not against speed — we are for accountability. The Alliance needs to move faster on capability adoption, and it needs to do so with governance structures that member states can trust."
Irene Benito Rodriguez — Co-Chair of NATO's Autonomy Task Force
Despite the challenges, the window for European AI companies in defence is opening faster than it has at any previous point. Three structural factors are driving this:
Defence budget expansion — European NATO members are collectively moving toward a 3% of GDP defence spending target. A significant share is explicitly earmarked for technology and AI. The addressable market is larger than it has ever been.
Institutional urgency — The combination of operational lessons from Ukraine and political pressure to increase European defence self-sufficiency has created genuine appetite at senior institutional levels to accelerate AI adoption.
The Innovation Partner model — Events and programmes specifically designed to connect start-ups with institutional buyers and procurement officials are proliferating. The AI in Defence Summit's Innovation Partner programme structures this directly: demonstrations with institutional end-users, capital introductions, and access to procurement conversations that matter.
Founders building defence AI in Europe consistently identify the same gap: what constrains them is not technology or market access. It is the combination of regulatory navigation and institutional relationship-building, which most start-ups lack the resources to pursue in parallel with building a product.
The companies that break through are typically those that have found a senior institutional champion — someone inside a ministry, agency, or prime contractor who understands the capability and has the authority to move things forward. That relationship is rarely found through a cold procurement process. It is found in the right rooms, in the conversations that happen at the margins of structured agendas.
The AIDEF27 Innovation Partner programme connects European deep tech ventures with institutional buyers and strategic investors. Limited to 10 companies. Apply at aidefencesummit.eu.