Robotic hands and arms may be getting more sophisticated, but they don’t really rival what we think C-3PO would have poking out of his torso. That is, until now: a European science team have been busy creating the Sensopac robotic limb, and it’s arguably the most human-like robotic limb yet. That’s partly because its sophistication is derived from software modelled on the human cerebellum. The arm has artificial skin that can sense force and direction in detail, and its 38 motors mimic the structure of human muscles and tendons to give it a very human-like grip.
Those motors are arranged in opposing pairs, along with non-linear spring systems so that they mimic the opposing muscle structure that gives human hands their dexterity. The team achieved this by making hundreds of MRI scans of real hands in different positions. And apparently this has paid off, since it means the Sensopac hand can snap its fingers, pick up an egg or carry a cup of liquid much like we do.
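To get a feel for that opposing-pair idea, here’s a minimal sketch of a single joint driven by a flexor/extensor pair through non-linear springs. To be clear, this is not Sensopac’s actual control code: the quadratic spring law, the stiffness constant and the moment arm are all invented for illustration.

```python
def spring_tension(activation, stretch, k=2.0):
    """Non-linear (quadratic) spring: tension rises with the square of
    stretch, scaled by the motor's activation level in [0, 1]."""
    return activation * k * max(0.0, stretch) ** 2

def joint_torque(flexor_act, extensor_act, angle,
                 angle_min=-1.0, angle_max=1.0, moment_arm=0.02):
    """Net torque on the joint from an opposing (antagonistic) pair.

    The flexor spring stretches as the joint extends (angle grows), the
    extensor spring as it flexes, so their tensions pull against each
    other. Positive torque here means 'toward flexion'."""
    flexor = spring_tension(flexor_act, angle - angle_min)
    extensor = spring_tension(extensor_act, angle_max - angle)
    return moment_arm * (flexor - extensor)
```

One nice property of the arrangement: activating both motors equally stiffens the joint without moving it (the tensions cancel), which is exactly how biological antagonist pairs trade speed for precision when, say, holding an egg.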
The really clever bit, though, is in the artificial intelligence that controls the limb. In humans the cerebellum controls sensation and movement, so the team have created a neural-net system that mimics it to control the arm in a more “natural” manner. It’s apparently the “first neural-network-based controller that can control the dynamics of a robotic system in its full operational range,” which means the arm (when perfected) would be able to pick up a cup, sense what the contents feel like and handle it appropriately.
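The core idea of a neural-network controller is a learned mapping from sensed state to motor commands. Here’s a deliberately tiny, purely illustrative sketch of that mapping: the layer sizes, toy weights and input choices (joint angle, velocity, grip force, target) are all my own invention, and Sensopac’s cerebellum-inspired controller is vastly more sophisticated than this.

```python
import math

def dense(inputs, weights, biases):
    """One fully connected layer with tanh activation."""
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

class TinyController:
    """Toy feed-forward net: 4 sensed inputs -> 3 hidden units ->
    2 motor activations (flexor, extensor). Weights are hand-picked
    placeholders, not trained values."""

    def __init__(self):
        self.w1 = [[0.5, -0.2, 0.1, 0.3],
                   [-0.4, 0.6, 0.2, -0.1],
                   [0.1, 0.1, -0.5, 0.4]]
        self.b1 = [0.0, 0.1, -0.1]
        self.w2 = [[0.7, -0.3, 0.2],
                   [-0.2, 0.5, 0.6]]
        self.b2 = [0.0, 0.0]

    def command(self, angle, velocity, grip_force, target_angle):
        hidden = dense([angle, velocity, grip_force, target_angle],
                       self.w1, self.b1)
        flexor, extensor = dense(hidden, self.w2, self.b2)
        # Map tanh outputs from (-1, 1) into activation levels in (0, 1)
        return (flexor + 1) / 2, (extensor + 1) / 2
```

The appeal of this approach over classical control is that the network can, in principle, be trained on the arm’s real behaviour across its whole operating range, rather than relying on an analytical model of 38 interacting motors and springs.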
The arm is now in advanced testing, but it’ll be a while before robots have a limb that behaves exactly like ours do: that’s “still light-years away” according to the project coordinator. Doesn’t stop me wondering how long it’ll be until there’s an entire neural-net robot droid built with this biomimetic tech though. And then there’s personality downloads to think about… [ICTResults via Physorg]