The Robotics Roundup is a weekly newspost covering some of the most exciting developments in robotics over the past week.
In today’s edition we have:
- ‘World’s most advanced’ humanoid robot draws a cat and leaves people shocked
- A robotic raspberry teaches machines how to pick fruit
- Nobody Here But Us Fish
- Robots say they won’t steal jobs, rebel against humans
- Training robots how to learn, make decisions on the fly
A humanoid robot named Ameca, known for its ability to speak multiple languages, has amazed tech enthusiasts with its talent for completing tasks on request. In a video by Engineered Arts, Ameca is given a pen and asked to draw a “cute-looking cat.” Within a minute, the robot produces a remarkably accurate drawing of a cat on a canvas.
Researchers have created a physical twin that mimics the feel of real raspberry plants, allowing robots to be trained on raspberry picking in any season, without the need for actual raspberries. In initial field demonstrations, the lab-trained robot achieved an 80% success rate without modification. The physical twin can also be used to evaluate sim2real transfer and improve the realism of simulations. The robotic system used for raspberry harvesting comprises a mobile manipulator with a 4WD mobile base, a 6-degree-of-freedom robotic arm, and a custom-made gripper. The physical twin concept extends the idea of digital twins into the physical domain, reducing costs and the reliance on field experimentation.
Engineers at ETH Zurich have developed an AI-powered robotic fish named Belle to study marine life in a minimally invasive way. Belle is designed to be silent, look like a real fish, and swim naturally to collect data without disrupting marine creatures or habitats. Equipped with a high-resolution camera and the ability to collect environmental DNA samples, Belle can autonomously operate for about two hours before resurfacing to transmit data. This innovative approach aims to gain a better understanding of oceanic environments, which are crucial for regulating climate, sustaining ecosystems, and supporting livelihoods worldwide.
During the ‘AI for Good’ conference in Geneva, humanoid robots said they expect their numbers to grow and that they will help solve global problems without stealing humans’ jobs or rebelling against them. While some robots endorsed regulation, others emphasized the need for limitless exploration and opportunity. The robots showcased their upgraded generative AI capabilities and surprised their inventors with sophisticated responses to questions. Overall, the robots recognized the potential for humans and AI to collaborate in an effective synergy and make a positive impact on the world.
Researchers at the University of Illinois Urbana-Champaign have developed a learning-based method that enables robots on extraterrestrial bodies, such as moons orbiting Saturn or Jupiter, to decide autonomously how and where to scoop up terrain samples. The method allows robots to quickly adapt and learn to scoop on different materials and changing landscapes. The team trained a deep Gaussian process model on a large-scale dataset of 6,700 data points spanning 67 different terrains. Using vision and minimal online experience, the model outperformed non-adaptive methods and other state-of-the-art meta-learning methods. The researchers hope this method will aid in the exploration of ocean worlds, where limited information and limited battery life make autonomous decision-making crucial.
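The team's deep Gaussian process model is beyond the scope of a newsletter snippet, but the core idea, predicting an outcome with calibrated uncertainty from a handful of terrain observations, can be sketched with ordinary Gaussian process regression in NumPy. Everything below (the RBF kernel, the one-dimensional "terrain hardness" feature, the toy success scores) is an illustrative assumption, not the authors' actual model or data.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    # Squared-exponential kernel between two sets of 1-D inputs.
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    # Standard GP regression: posterior mean and per-point variance
    # at the test inputs, given noisy training observations.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s.T @ alpha
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, np.diag(cov)

# Toy data: scooping "success score" as a function of a single
# terrain-hardness feature (entirely synthetic illustration).
x = np.array([0.1, 0.4, 0.6, 0.9])
y = np.array([0.9, 0.7, 0.4, 0.1])
mean, var = gp_posterior(x, y, np.array([0.5]))
```

The uncertainty estimate is what makes this family of models attractive for the scooping problem: a robot can scoop where predicted success is high, or deliberately probe where variance is high, and fold each new observation back into the training set to adapt online.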