Meet ANYmal, a four-legged dog-like robot designed by researchers at ETH Zürich in Switzerland, who hope to use such robots for search-and-rescue on building sites or in disaster areas, among other applications. Now ANYmal has been upgraded to perform rudimentary parkour moves, aka "free running." Human parkour enthusiasts are known for their remarkably agile, acrobatic feats, and while ANYmal can't match those, the robot successfully jumped across gaps, climbed up and down large obstacles, and crouched low to maneuver under an obstacle, according to a recent paper published in the journal Science Robotics.
The ETH Zürich team introduced ANYmal's original approach to reinforcement learning back in 2019 and enhanced its proprioception (the ability to sense movement, action, and location) three years later. Just last year, the team showcased a trio of customized ANYmal robots, tested in environments as close to the harsh lunar and Martian terrain as possible. As previously reported, robots capable of walking could assist future rovers and mitigate the risk of damage from sharp edges or loss of traction in loose regolith. Every robot had a lidar sensor, but each was specialized for particular functions while remaining flexible enough to cover for the others—if one glitches, the rest can take over its tasks.
For instance, the Scout model’s main objective was to survey its surroundings using RGB cameras. This robot also used another imager to map regions and objects of interest using filters that let through different areas of the light spectrum. The Scientist model had the advantage of an arm featuring a MIRA (Metrohm Instant Raman Analyzer) and a MICRO (microscopic imager). The MIRA was able to identify chemicals in materials found on the surface of the demonstration area based on how they scattered light, while the MICRO on its wrist imaged them up close. The Hybrid was more of a generalist, helping out the Scout and the Scientist with measurements of scientific targets such as boulders and craters.