
Embodied AI Center
Kiel University
VISION
Artificial Intelligence for the Physical World
Artificial intelligence is rapidly evolving from purely digital, text- and image-based systems toward embodied intelligence, known as Embodied AI: systems that perceive, act, and learn in the physical world. This new phase of AI research is emerging from the convergence of robotics, machine perception, and large multimodal AI models.
The E-AI-C will bring together the CAU's existing expertise in surgical robotics, marine robotics (including cooperation with GEOMAR), and humanoid and quadruped robotics within a shared structure.
By closely interlinking physical embodiment with the development of modern AI methods for robots, the center will lay the foundation for new scientific and technological breakthroughs, with far-reaching impact on medical applications, marine research and marine infrastructures, and human-robot collaboration.

Objectives
Advance embodied learning in real-world environments
Develop AI systems that can perceive, reason, plan, and act continuously in dynamic physical settings, enabling robots to learn from interaction rather than from static data alone.
Support data-driven business transformation
Enable organizations to integrate intelligent embodied systems, automation, simulation, and AI-assisted decision-making into their workflows, improving efficiency, resilience, and competitiveness.
Develop coordinated multi-agent AI systems
Create teams of robots, software agents, and autonomous systems that can communicate, coordinate, and jointly solve complex tasks in dynamic real-world environments.
Enable safe, trustworthy, and domain-adaptive robotic autonomy
Build reliable autonomous systems for surgical, marine, humanoid, and quadruped robotics, with a strong focus on safety, robustness, human-robot collaboration, and transfer across application domains.
Develop multimodal Vision-Language-Action models for robotics
Create and evaluate AI models that connect language, visual perception, sensor data, and physical motion, allowing robots to understand instructions and translate them into safe, context-aware actions.

Selected Publications
- A. R. Wagner, M. Balaji Rao, H. Wrede, S. Pirk, X. Xiao, Fire as a Service: Augmenting Robot Simulators with Thermally and Visually Accurate Fire Dynamics, arXiv, 2026, [Preprint]
- A. R. Wagner, M. Balaji Rao, X. Xiao, S. Pirk, Understanding Fire Through Thermal Radiation Fields for Mobile Robots, arXiv, 2026, [Preprint]
- S. Huber, K. Pelzer, D. Nquyen, X. Xiao, S. Pirk, HUMEMBR: Learning Human Routines for Predictive Embodied Navigation, under submission, 2026
- ...

