Hello, everyone,
I’m excited to introduce you to ROBOTIS AI WORKER, an open-source, ROS 2-based mobile humanoid robot that leverages teleoperation and transformer-based imitation learning to perform real-world manipulation tasks. We’re pushing the boundaries of Physical AI, making complex robot behaviors accessible through intuitive human demonstrations.
ROBOTIS AI WORKER, Imitation Learning Demo
This video showcases the imitation learning pipeline for AI WORKER — starting from teleoperation-based data collection, through data visualization, and finally model inference.
Our current policy is based on ACT (Action Chunking with Transformers), enabling the robot to learn and execute high-level behaviors from demonstration data.
In this video, you’ll see:
- Teleoperation for collecting demonstration data
- Visualization of the collected data
- Model inference using a trained transformer-based model
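For readers curious what "action chunking" means in practice: an ACT-style policy predicts a short sequence (chunk) of future actions per inference call, and the controller drains that chunk before querying the model again. The sketch below illustrates only this control-loop idea; `fake_policy`, the chunk size, and the toy environment are hypothetical stand-ins, not the AI WORKER implementation.

```python
from collections import deque

import numpy as np

CHUNK = 8  # number of actions predicted per policy call (assumed horizon)


def fake_policy(obs: float) -> np.ndarray:
    """Stand-in for a trained ACT model: returns a chunk of future actions."""
    return np.linspace(obs, obs + 1.0, CHUNK)


def run_episode(steps: int = 20) -> list:
    obs = 0.0
    buffer = deque()  # actions queued from the most recent chunk
    executed = []
    for _ in range(steps):
        if not buffer:  # replan only when the current chunk is exhausted
            buffer.extend(fake_policy(obs))
        action = buffer.popleft()
        obs = float(action)  # toy "environment": the state tracks the action
        executed.append(float(action))
    return executed


actions = run_episode()
```

Executing in chunks reduces how often the (relatively expensive) transformer is queried and, in the full ACT method, is typically combined with temporal ensembling over overlapping chunks for smoother motion.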
We begin with a simple pick-and-place task, but more complex manipulation behaviors are on the way.
Stay tuned as we continue to push the boundaries of physical AI in real-world robotics.
This project is built with ROS 2 and is fully open source.
GitHub Repositories:
- ROBOTIS-GIT/ai_worker: AI Worker: FFW (Freedom From Work)
- ROBOTIS-GIT/physical_ai_tools
This project makes use of open-source resources from ROS and Hugging Face LeRobot — many thanks to those communities for sharing such valuable tools!
“ROBOTIS AI Worker Demo Showcase”
#ROBOTIS #AIWORKER #ImitationLearning #Teleoperation #ROS2 #Transformers #ActionChunking #RobotLearning #PhysicalAI #HuggingFace #LeRobot #OpenSource