Op-ed by Leicht and Fox: Why Germany Needs an AI Security Institute

Philip Fox, 17 March 2025


Update 05/2025: In its coalition agreement, the new Federal Government commits itself to AI safety and applied resilience research. A German AISI would directly address these commitments, as Philip Fox, Anton Leicht and Max Negele lay out in a policy brief.


Germany should establish an AI Security Institute (AISI) modeled on international predecessors, say KIRA policy analysts Anton Leicht and Philip Fox. Their proposal appeared today in Tagesspiegel Background, a daily briefing widely read in German policy circles.

An AISI would be the Federal Government's main point of contact for understanding and responding to national security threats from AI, such as AI-enabled cyberattacks on critical infrastructure. The research and advisory institute would have three core functions:

  1. Early monitoring of key AI trends and their policy implications

  2. Advising the government on questions about AI and national security

  3. Technical research on AI-enabled threats to national security

Several countries have founded AISIs in response to rapid AI progress, including the US, the UK, France, Japan, Canada and India. The UK in particular is leading by example, having set up a world-leading AISI within months. Its organizational blueprint – a €60m annual budget, entrepreneurial leadership, and competitive salaries to attract international top talent – could serve as a role model for Germany.

It is vital for Germany to build sovereign state capacity to respond to AI-related national security threats. In the past, Germany has proven through its federal innovation agency SPRIND that it can establish successful, fast-moving public institutions to address government needs. As AI increasingly impacts national security and sovereignty, the government should build on this precedent by creating a German AISI.
