The Chip That Can Smell Can Now Help Robots Feel Touch

Photo: Intel

Researchers from the National University of Singapore (NUS) have created an artificial robot skin that, paired with Intel’s neuromorphic Loihi chip, can process touch 1,000 times faster than the human nervous system. That’s the same Loihi chip that Cornell University researchers programmed a few months ago to recognise the scent of hazardous chemicals. Are these Loihi chips beginning to sound like the start of Star Trek’s Data to anyone else?

NUS researchers programmed the robot skin, working in tandem with an event-based camera, to recognise the shape, texture, and hardness of objects 10 times faster than “the blink of the eye.” The researchers also taught a second, more human-like robotic hand outfitted with the skin to read Braille, which the Loihi chip translated with more than 92 per cent accuracy.

These researchers presented the results of their experiments at the Robotics: Science and Systems virtual conference last week. (Their research paper can be found here.) In their presentation video, they showed off the robot at the centre of their research. The event camera, an integral part of the robot’s ability to process touch, works differently from a conventional camera: each pixel operates asynchronously and only picks up changes in illumination. Basically, information is transferred from the artificial skin and camera to the Loihi chip only as needed, when the camera and tactile sensors pick up the same object at the same time.
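The change-driven idea behind an event camera can be sketched in a few lines. This is a hypothetical illustration, not the researchers’ actual pipeline: each pixel stays silent unless its brightness changes by more than a threshold, so a mostly static scene produces almost no data. The function name and threshold value are illustrative assumptions.

```python
import numpy as np

def events_between(prev_frame, next_frame, threshold=10):
    """Return (row, col, polarity) events for pixels whose brightness
    changed by more than `threshold` between two frames.

    This mimics the event-camera principle: unchanged pixels send nothing."""
    diff = next_frame.astype(int) - prev_frame.astype(int)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    return [(r, c, 1 if diff[r, c] > 0 else -1) for r, c in zip(rows, cols)]

prev_frame = np.zeros((4, 4), dtype=np.uint8)
next_frame = prev_frame.copy()
next_frame[1, 2] = 200          # a single pixel brightens sharply

events = events_between(prev_frame, next_frame)
print(events)                   # only the changed pixel produces an event
```

In a 4×4 frame where one pixel changes, only one event is emitted; a conventional camera would have sent all 16 pixel values again, which is why the event-driven approach keeps latency and bandwidth low.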

One of the benefits of transferring information this way is lower latency, or the time it takes for the robot’s tactile sensors and camera to transfer information to the Loihi chip, the researchers explained. It’s part of the reason why the robot arm was able to recognise objects via touch so quickly. Combining an event-based camera with tactile sensors also makes the system more accurate at recognising objects than visual data or tactile data alone would, according to the researchers, by about 10 per cent.

Intel noted in a blog post about the researchers’ findings that enabling a human-like sense of touch in robotics could allow automated robots in factories to “easily adapt to changes…using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping.” Such advances could also lead to safer human-robot interactions in care-giving professions or in an operating room.

Unlike other robot hands we’ve seen in the past (like the BionicSoftHand with its insane amount of fine motor skills), the one presented in the conference video has the artificial skin pulled over two plates that can squeeze together to pick up objects. It’s a far cry from the BionicSoftHand, but imagine these two types of robotic hands combined: a robot hand that could not only sense the texture of, say, an orange, but also roll it around in its hand, or maybe even toss it up in the air and catch it. And, of course, there is Boston Dynamics’ robotic dog, Spot. Looks like we’re on track to see sophisticated robots that will eventually replace us all, which might be exactly what the world needs.