When buying gadgets, comparison is paramount. There are inevitably a hundred TVs that fit the general requirements you've set out, a few dozen Blu-ray players and a handful of smartphones. In many cases, it ends up being a process of elimination, and standardised gadget ratings would ensure that process was a fair and informed one.
As our society comes to terms with the direness of our energy situation, and as the idea of "green" transforms from buzzy marketing bullshit to something that our gadgets actually have to be, it will be essential to have real, digestible data on how the electronics we use impact the environment. Some considerations here could include:
• Power consumption: how much a gadget uses while plugged in and operating, and how much it draws while plugged in but not being used.
• Materials: how environmentally friendly the materials used in a product are.
• Supply chain: under what conditions the product was manufactured, and from which countries its parts originated.
• Durability: how many use cycles a product can be expected to last.
• Disposability: how long a product or its packaging will take to degrade in various situations.
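One way to make criteria like these digestible is to roll them into a single number. Here's a minimal sketch of how that might work; the sub-scores, weights and 0-100 scale are all hypothetical, since no such standard exists yet.

```python
# Hypothetical weights for the five criteria above (must sum to 1.0).
WEIGHTS = {
    "power": 0.30,         # consumption in use and on standby
    "materials": 0.20,     # environmental friendliness of materials
    "supply_chain": 0.15,  # manufacturing conditions and origins
    "durability": 0.20,    # expected use cycles
    "disposability": 0.15, # time to degrade
}

def green_score(subscores: dict) -> float:
    """Weighted average of per-criterion sub-scores (each 0-100)."""
    return sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)

# Made-up sub-scores for an imaginary gadget:
example = {"power": 70, "materials": 55, "supply_chain": 40,
           "durability": 80, "disposability": 60}
print(round(green_score(example), 1))  # -> 63.0
```

The point isn't the particular weights, which would be argued over endlessly, but that any shopper could compare two products on one published figure.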
Some terms and standards for addressing these issues are already floating around. "Vampire draw" is a more colourful way to talk about the power our gadgets quietly suck while they're plugged in but not in use, and since 1992, Energy Star has been giving consumers a vague notion that their products were gobbling up a little less energy than they could be. But if you walked into a Best Buy and asked the people inside - the people buying things and the people selling them - what standards were required of any given product for it to bear the Energy Star sticker, how many of them would have any clue? Not very many, I imagine.
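Vampire draw is also one of the easiest things to put a concrete number on. A back-of-the-envelope sketch, using made-up figures for the wattage and electricity price:

```python
def annual_standby_cost(standby_watts: float,
                        idle_hours_per_day: float = 20.0,
                        price_per_kwh: float = 0.15) -> float:
    """Yearly cost of standby power, in whatever currency price_per_kwh uses."""
    kwh_per_year = standby_watts * idle_hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# e.g. a set-top box drawing 10 W while "off", idle 20 hours a day:
print(round(annual_standby_cost(10.0), 2))  # -> 10.95
```

A standardised label could print exactly this kind of figure on the box, instead of leaving shoppers to guess.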
Green stats are just the start; similar standardised ratings could overhaul the way we evaluate all our devices' specs. Sure, many of the ones you might consider when buying a new gadget are objective: Megapixels. Processor speeds. Screen sizes. But why do we blindly trust the companies that make our gadgets to faithfully report things like battery life? Why do we have to rely on websites to run benchmarks for every new machine that comes out? Here are just a few things that could be tested by a third party:
• Battery life: standardised tests for various usage scenarios. For a music player, this could mean playing straight through, playing on shuffle, or selecting particular songs and scrubbing to a particular moment.
• Benchmarks: standardised tests for CPUs and GPUs.
• Power-on and shutdown times: tests showing how long various models take to turn on completely, shut down completely, enter a sleep state, wake from a sleep state, and so on.
• Display: standardised tests for brightness, colour reproduction and the like.
• Wireless reception: how strong a signal a device gets over Wi-Fi, Bluetooth, etc.
• Noise: how loud larger products such as desktops and appliances are while operating.
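For the timing tests in particular, the hard part isn't the stopwatch, it's the protocol: how many runs, and which number gets published. One plausible sketch, in which `wake_device` is just a placeholder for whatever actually triggers the hardware:

```python
import statistics
import time

def timed_runs(action, runs: int = 5) -> float:
    """Median wall-clock duration of `action` over several runs, in seconds.

    Reporting the median rather than a single run means one fluke
    doesn't skew the published figure.
    """
    durations = []
    for _ in range(runs):
        start = time.perf_counter()
        action()
        durations.append(time.perf_counter() - start)
    return statistics.median(durations)

def wake_device():
    time.sleep(0.01)  # stand-in for the real wake-from-sleep sequence

print(f"median wake time: {timed_runs(wake_device):.3f} s")
```

Whatever the exact protocol, the crucial thing is that every lab runs the same one, so an "instant-on" claim means the same thing on every spec sheet.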
Things like stock specs and Energy Star standards are a start, but only that. Establishing standardised tests for aspects of performance and power consumption - and, perhaps, as the EPA has suggested for the auto industry, assigning a letter grade based on those numbers - would help keep consumers informed and companies honest.
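The letter-grade step is trivial once the numbers exist; the grade bands below are invented purely for illustration.

```python
# Hypothetical score-to-grade bands, in the spirit of the EPA's
# proposed fuel-economy labels.
GRADE_BANDS = [(90, "A"), (80, "B"), (70, "C"), (60, "D")]

def letter_grade(score: float) -> str:
    """Map a 0-100 composite score to a letter grade."""
    for cutoff, grade in GRADE_BANDS:
        if score >= cutoff:
            return grade
    return "F"

print(letter_grade(84))  # -> B
print(letter_grade(52))  # -> F
```

All the genuinely hard work lives in the testing standards behind the score; the grade is just the part the shopper sees.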