This paper presents the hardware and software design of the robots developed by the PKU-SHRC team for RoboCup 2010 in Singapore. Each robot has 25 actuated degrees of freedom driven by Dynamixel RX-28 and RX-64 servos, and the whole system runs on a high-speed ARM and DSP platform. The software integrates computer vision, speech recognition, pattern recognition, and motion control. Using vision, the robots detect and track individual objects on the field; they localize themselves accurately with a machine-learning algorithm, communicate with one another in natural language via speech recognition, and perform planning and inference with artificial-intelligence techniques. Our goal is to make the robots more human-like, not only in appearance and mechanical design but also in innate intelligence.
Keywords: robot, vision, speech, artificial intelligence
Authors: Guangnan Ye, Xibin Chen, Kaihua Jiang, Caifu Hong, Xu Wang, Wenqian Zhang, Guangcheng Zhang, and Xihong Wu, Speech and Hearing Research Center, Key Laboratory of Machine Perception, Peking University, China