In a first, Oregon State University says its bipedal robot Cassie managed to go from the couch to a 5K in 53 minutes. It’s an impressive feat of machine learning and robotics engineering, but as far as 5Ks go, I wouldn’t worry about having to outrun killer humanoid robots just yet.
Cassie has been around for a while and is the brainchild of Agility Robotics — a spinoff of OSU. The significant thing here isn’t so much the speed at which Cassie completed the 5K. The average able-bodied human, for instance, can walk a 5K in a similar amount of time or less, and most beginner runners complete that distance in 30-40 minutes. (Even in OSU’s video, you can see folks keeping up with Cassie while walking.) The impressive thing is that Cassie was able to “run” that far untethered and on a single charge.
Running is, biomechanically speaking, pretty complex. There are whole sites dedicated to dissecting running gaits, but most people just run instinctively. Recreating that ability in robots — especially bipedal robots — is difficult, as it requires something called “dynamic balancing,” or the ability to shift positions in motion without toppling over. The team says Cassie was able to teach itself how to make subtle adjustments while moving thanks to a deep reinforcement learning algorithm.
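OSU hasn’t published the controller’s internals in this announcement, so as a rough illustration only, here is a minimal sketch of the general idea behind policy-gradient reinforcement learning: the robot tries actions, scores how well each rollout went, and nudges its policy toward actions that preceded good outcomes. Every function name here is hypothetical, and this toy linear-softmax REINFORCE update stands in for whatever (far larger) deep network Cassie actually uses.

```python
import numpy as np

def discounted_returns(rewards, gamma=0.99):
    """Credit each timestep with the discounted sum of future rewards."""
    G, out = 0.0, []
    for r in reversed(rewards):
        G = r + gamma * G
        out.append(G)
    return np.array(out[::-1])

def softmax_policy(weights, state):
    """Map a state vector to action probabilities via a linear softmax."""
    logits = weights @ state
    z = np.exp(logits - logits.max())  # subtract max for numerical stability
    return z / z.sum()

def reinforce_step(weights, states, actions, rewards, lr=0.01, gamma=0.99):
    """One REINFORCE update: push probability toward high-return actions."""
    returns = discounted_returns(rewards, gamma)
    returns = (returns - returns.mean()) / (returns.std() + 1e-8)  # normalize
    grad = np.zeros_like(weights)
    for s, a, G in zip(states, actions, returns):
        probs = softmax_policy(weights, s)
        one_hot = np.zeros(len(probs))
        one_hot[a] = 1.0
        # Gradient of log pi(a|s) for a linear softmax policy, scaled by return
        grad += np.outer(one_hot - probs, s) * G
    return weights + lr * grad

# Toy usage: 2 actions ("lean left"/"lean right"), 3-dimensional state
rng = np.random.default_rng(0)
w = np.zeros((2, 3))
states = [rng.standard_normal(3) for _ in range(5)]
actions = [0, 1, 0, 1, 0]
rewards = [1.0, 0.5, 1.0, 0.0, 1.0]
w = reinforce_step(w, states, actions, rewards)
```

Over many thousands of simulated rollouts, updates like this are what let a learned controller discover the subtle mid-stride corrections that dynamic balancing demands, rather than having engineers hand-code them.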
“The Dynamic Robotics Laboratory students in the OSU College of Engineering combined expertise from biomechanics and existing robot control approaches with new machine learning tools,” said Jonathan Hurst, an OSU robotics professor and co-founder of Agility Robotics. “This type of holistic approach will enable animal-like levels of performance. It’s incredibly exciting.”
However, balance is especially challenging for bipedal robots compared to their quadrupedal cousins. Boston Dynamics’ Cheetah robot, for example, can run at 45 km/h (about 28 mph), while Unitree’s Go1 robodog can run alongside someone at a moderate 5.6-6 mph. These four-legged bots have been able to run at high speeds for quite some time now, though not necessarily for long distances on their own. Meanwhile, Cassie’s 53-minute runtime included about 6.5 minutes of delays after the bot’s computer overheated, plus another incident where it took a turn too fast and fell over. The main argument for bipedal bots, however, is that they can theoretically slot into everyday life more easily.
OSU envisions bipedal robots will one day handle logistics work, such as package deliveries, but also help people in their homes. “In the not very distant future, everyone will see and interact with robots in many places in their everyday lives, robots that work alongside us and improve our quality of life,” said Hurst. To that end, Cassie was also recently able to use machine learning to teach itself to walk up and down stairs without LIDAR or cameras.
While that’s the goal for the future, you won’t be seeing bipedal robots out and about anytime soon. At least, not outside university campuses. These bots still face plenty of mechanical and engineering challenges before they can really become a kind of consumer gadget. (See: the several companion robots that died ignominious deaths in 2018.) But in the meantime, can somebody please do something about the nightmare appearance of these highly advanced robots? Ain’t nobody inviting these things into their homes looking like that.