The researchers had already taught their bots to place objects correctly oriented for their function and in relation to other objects (see video above), but that work didn’t go far enough. It’s great that a robot can place a laptop right side up, but not so great if it puts it on top of a bookshelf where a human can’t reach it. So the next step was to teach the robots to orient objects around the way humans actually use them: the robot learnt a set of “human poses”, then hallucinated humans posing all over the room.
With those imagined humans in place, the robots really started to get it: humans sit on the couch facing the TV, and the remote control belongs within arm’s reach, not on the floor by their feet. When the researchers combined this human-orientation model with their earlier object-to-object orientation work, placement accuracy jumped to 86 per cent, which is pretty impressive for a bunch of tripping robots.
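To make the idea concrete, here is a minimal toy sketch of the approach described above: imagine human poses at plausible seats, then prefer object placements within arm’s reach of one of those poses. All names, values, and the scoring rule here are illustrative assumptions, not the researchers’ actual code.

```python
import math

ARM_REACH = 0.8  # metres; an assumed comfortable reaching distance

def hallucinate_poses(seats):
    """Pretend a human occupies each plausible seat (e.g. couch cushions)."""
    return [(x, y) for (x, y) in seats]

def placement_score(candidate, poses):
    """Higher is better: reward spots within arm's reach of some pose."""
    d = min(math.dist(candidate, p) for p in poses)
    return 1.0 / (1.0 + d) if d <= ARM_REACH else 0.0

def best_placement(candidates, poses):
    """Pick the candidate location that best serves a hallucinated human."""
    return max(candidates, key=lambda c: placement_score(c, poses))

poses = hallucinate_poses([(0.0, 0.0)])  # one imagined person on the couch
candidates = [(0.5, 0.0), (3.0, 2.0)]    # side table vs. a far bookshelf
print(best_placement(candidates, poses))  # the remote goes on the side table
```

In the real system the scoring would also fold in the object-to-object relationships from the earlier work; this sketch only shows the human-pose half of the combination.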
The work certainly still has a ways to go, but this is major progress. It could eventually lead to a bot that could not only pick up after you, but also rearrange a room in a more ergonomic way. Speaking as a walking, talking organisational nightmare, the days of hallucinating robots can’t come soon enough, as long as they don’t just sit around watching the walls breathe and pulse. [Cornell University Chronicle via Kurzweil AI]