Robotic Real-Time Human Tracking


At BAÜP, we are interested in robotics that does more than execute predefined commands. In one of our recent lab experiments, we explored real-time human tracking with a KUKA iisy robot, combining camera perception, AI-based face tracking, custom inverse kinematics and interactive software tools into a single responsive system.

The goal was to move away from traditional robot programming and towards a more adaptive model of behaviour: a robot able to detect a person, process that information live and generate motion accordingly, without depending on a fixed sequence prepared beforehand.
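To make this concrete, the core idea of perception-driven motion can be sketched as a mapping from a detected face bounding box to robot orientation targets. The sketch below is illustrative only, not the lab's actual pipeline: the function name, the linear pixel-to-angle mapping, and the field-of-view parameters are all assumptions.

```python
def face_to_pan_tilt(bbox, frame_w, frame_h, fov_h_deg=60.0, fov_v_deg=40.0):
    """Map a face bounding box (x, y, w, h) in pixel coordinates to
    pan/tilt offsets in degrees.

    Assumes a simple linear mapping from pixel offset to angle; the
    field-of-view values are placeholder camera parameters.
    """
    x, y, w, h = bbox
    # Face centre relative to the image centre, normalised to [-0.5, 0.5].
    cx = (x + w / 2.0) / frame_w - 0.5
    cy = (y + h / 2.0) / frame_h - 0.5
    pan = cx * fov_h_deg    # positive = face right of image centre
    tilt = -cy * fov_v_deg  # positive = face above image centre
    return pan, tilt

# Example: a face left of centre in a 640x480 frame yields a negative pan.
pan, tilt = face_to_pan_tilt((100, 200, 80, 80), 640, 480)
```

In a live loop, a detector (e.g. an AI face-tracking library) would supply a fresh bounding box each frame, and the resulting pan/tilt targets would feed the motion layer continuously rather than a pre-scripted sequence.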

To achieve this, we developed a multi-layer architecture that connected visual sensing, tracking, motion logic and robotic control in real time. A custom inverse kinematics solver built in-house played a central role, giving us direct control over how movement was resolved from live perception data. Together with AI tracking libraries and tools such as TouchDesigner, the system became a testbed for a broader idea we care deeply about: robotics as a responsive, perception-driven medium.
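The role of an inverse kinematics solver in such a pipeline can be illustrated with the simplest analytic case: a planar two-link arm, solved with the law of cosines. This is a minimal sketch of the general technique, not the in-house solver itself, which has to handle the six-axis geometry of the KUKA iisy; link lengths and function names here are illustrative.

```python
import math

def ik_2link(x, y, l1, l2):
    """Analytic inverse kinematics for a planar 2-link arm.

    Returns (shoulder, elbow) joint angles in radians for the
    elbow-up solution, or None if the target is unreachable.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        return None  # target lies outside the reachable workspace
    theta2 = math.acos(c2)
    # Shoulder angle: direction to target minus the elbow's offset.
    theta1 = math.atan2(y, x) - math.atan2(
        l2 * math.sin(theta2), l1 + l2 * math.cos(theta2))
    return theta1, theta2

def fk_2link(theta1, theta2, l1, l2):
    """Forward kinematics, used here to verify an IK solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Owning the solver at this level is what gives direct control over how motion is resolved: each incoming perception update can be converted to joint targets immediately, with full visibility into reachability limits and solution branches.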

This is one of the directions shaping our Lab — the development of robotic systems that can sense, decide and move in continuous relation to human presence.