It’s usually easy for our human brains to predict how any given car, pedestrian or cyclist is going to act, but computers must be programmed to “understand” all of our varying behaviours on the road. The latest thing perplexing Google’s self-driving cars (and thereby entertaining us)? A simple track stand, according to the Washington Post’s Matt McFarland.
In July, Google brought its self-driving car testing program to Austin, which presents new challenges for the prototype autonomous vehicle. For example, Austin’s population of cyclists, fixed-gear and otherwise, appears to be larger than that of Mountain View, California. Before I go on, let me pause to allow you to get all of your damn-hipster jokes out of your system. Feel better? Let’s continue!
McFarland writes of a recent account by a cyclist in Austin, who had a funny interaction with a Google car at a four-way stop. The rider doesn’t say whether he was riding a fixed-gear bike specifically, but he rolled up to a stop sign and started track standing while he waited for the Google car to pass. He explains:
it apparently detected my presence (it’s covered in Go-Pros) and stayed stationary for several seconds. it finally began to proceed, but as it did, I rolled forward an inch while still standing. the car immediately stopped…
I continued to stand, it continued to stay stopped. then as it began to move again, I had to rock the bike to maintain balance. it stopped abruptly.
we repeated this little dance for about 2 full minutes and the car never made it past the middle of the intersection. the two guys inside were laughing and punching stuff into a laptop, I guess trying to modify some code to ‘teach’ the car something about how to deal with the situation.
the odd thing is that even tho it was a bit of a CF, I felt safer dealing with a self-driving car than a human-operated one.
It makes sense. If Google’s cars perceive a human, car or cyclist in motion, they automatically stop. A track stand, in which a cyclist balances in place on the pedals, rocking ever so slightly to stay upright, is a nebulous and confusing activity for a computer’s rigid understanding of human behaviour.
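To see why that rule produces the stop-and-go “dance” the cyclist describes, here’s a toy sketch. This is emphatically not Google’s actual code; the function name, the motion threshold and the speed readings are all invented for illustration. It just models a planner that halts whenever any nearby object appears to be moving:

```python
# Hypothetical sketch of a stop-on-motion rule. The threshold value and
# all readings below are made up for illustration.
MOTION_THRESHOLD = 0.05  # metres per second (assumed value)

def car_action(object_speeds):
    """Return 'STOP' if anything nearby appears to be moving, else 'GO'."""
    if any(abs(v) > MOTION_THRESHOLD for v in object_speeds):
        return "STOP"
    return "GO"

# A track stand as the sensors might see it: tiny forward/back rocks.
track_stand_speeds = [0.0, 0.1, 0.0, -0.1, 0.0, 0.1]

actions = [car_action([v]) for v in track_stand_speeds]
print(actions)  # alternates between 'GO' and 'STOP'
```

The cyclist never actually goes anywhere, but each little rock tips the speed back over the threshold, so a planner like this keeps flip-flopping between proceeding and braking, which is exactly the two-minute standoff described above.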
The tidbit highlighted above is also a glimpse into how the testing program uses actual humans to “teach” the car to understand the countless unexpected things it might discover on the road. Google has even gone as far as having humans pop out of bags in the middle of the street.
It sounds as though those testing scenarios didn’t include track stands, so now Google will need to “teach” its algorithms to understand a new facet of human culture.
Picture: AP Photo/Tony Avelar