
Embodied AI in Action: Physics-Aware Humanoid Robots for Real-World Interaction

Project ID: 2531ad1515

(You will need this ID for your application)

Research Theme: Artificial Intelligence and Robotics

UCL Lead department: Computer Science

Department Website

Lead Supervisor: Chengxu Zhou

Project Summary:

Why is this research important? As humanoid robots become increasingly capable, they need to understand not only what objects are but also how to interact with them physically. Current robots can recognise a box but may not understand how to open it, lift it, or move it safely. This research will address this gap by developing robots that understand object physics, enabling them to perform complex, whole-body tasks, such as lifting, carrying, or placing items with precision. This capability is crucial for robots to be truly useful in human environments, from hospitals and homes to warehouses and factories.

Who will you be working with? You will join a team of robotics experts at UCL Computer Science, known for pioneering work in embodied AI and humanoid robotics. Our team collaborates with leading researchers and industry partners to push the boundaries of robotics, especially in designing systems that integrate perception, learning, and physical interaction.

What will you be doing? You’ll be at the forefront of developing a physics-based model for humanoid robots, enabling them to learn how to interact with objects safely and effectively. Using force sensors and tactile feedback, you will help robots infer object properties such as weight and compliance, and use this knowledge to perform whole-body loco-manipulation tasks. Your work will involve hands-on experiments, model training, and testing on advanced humanoid robot platforms.
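To give a flavour of the kind of physics estimation involved, here is a minimal, illustrative sketch (not part of the project specification, and the sensor values are synthetic): estimating an object's mass from a wrist force sensor during a static hold, and its contact stiffness from paired press-depth and force samples.

```python
import numpy as np

GRAVITY = 9.81  # m/s^2

def estimate_mass(vertical_forces):
    """Estimate object mass (kg) from vertical force readings (N)
    taken while the object is held statically: m = F_z / g."""
    return float(np.mean(vertical_forces)) / GRAVITY

def estimate_stiffness(depths, contact_forces):
    """Estimate contact stiffness k (N/m) by a least-squares fit of
    Hooke's law F = k * x through the origin."""
    x = np.asarray(depths)
    f = np.asarray(contact_forces)
    return float(x @ f / (x @ x))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic readings for a ~2 kg object held still, with sensor noise.
    f_z = 2.0 * GRAVITY + rng.normal(0.0, 0.1, size=50)
    print(f"estimated mass: {estimate_mass(f_z):.2f} kg")

    # Synthetic press test on a compliant object with true k = 500 N/m.
    depths = np.linspace(0.001, 0.01, 10)  # press depth in metres
    forces = 500.0 * depths + rng.normal(0.0, 0.05, size=10)
    print(f"estimated stiffness: {estimate_stiffness(depths, forces):.0f} N/m")
```

In practice such estimates would come from real force/torque and tactile sensing during whole-body interaction, but the principle is the same: measured forces constrain a physical model of the object.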

Who are we looking for? We’re seeking a motivated individual with a strong background in machine learning for robotics. If you’re excited about creating the next generation of intelligent robots that interact safely and intuitively with the world, we’d love to hear from you!