It sounds like heresy, but I couldn’t help wondering exactly that after seeing the rows and rows of physical buttons, gauges, and readouts that Russian technicians use to run this nuclear power plant. Built post-Chernobyl, the plant boasts a “30km-wide security zone around the plant itself, filled with all sorts of sensors and monitoring devices that measure the condition of the environment to report any smallest deviation from normal radiation doses.” But what about all those clunky, straight-outta-Star-Trek knobs and lights – what if they’re a safety feature, too? The plant was completed in 1990, so colour computer screens, mice, and “normal” high-tech user interfaces were certainly available. How could a dizzyingly dense grid of plastic buttons be better than a single screen that can change to display only what the technician needs at any given moment?
Well, here’s the thing, as Christopher Mims at Technology Review brilliantly points out: touch is a powerful, powerful thing. And not the sterile, featureless version that passes for “touch” on your iPad. I’m talking about the physical, primal, ultra-high-res sensorium that you experience from interacting with everyday objects in the real world. Our brains and hands evolved the way they did for a reason, and virtual displays and interfaces simply don’t “click” with the kind of information-processing we’ve evolved to do so well. Deep, spatial sense-memory – “coloured THING in THAT location that feels like THIS and STAYS there” – is how our savannah-dwelling ancestors navigated their environment and avoided getting killed, and it still serves us today.
So while there were surely other factors influencing the Russian nuke-designers’ UI philosophy (economics being a huge one), it’s hard not to wonder if they shied away from screens on purpose. When you’re always one wrong command away from another Chernobyl, it pays to play by the brain’s built-in rules – not Steve Jobs’s.