That is, as the bard said, the question. Or he would have, had he been a mobile broadband consumer living right now. But why are we so concerned with whether or not a certain implementation of 4G is actually "4G"? It's not just a matter of being pedantic -- it's a matter of not confusing people. Quick question: When did high definition TV broadcasts start? No sneaky researching the answer; just get a rough guesstimate in your head for later.
Now, on with the show.
It happens every single time we write about Australian 4G here at Gizmodo, with Elly's review of the HTC Velocity 4G being the most recent.
We comment that Telstra's current LTE implementation doesn't meet the generally understood designation for 4G, namely that it should be theoretically capable of 100Mbps downstream and 50Mbps upstream. An argument ensues as to what constitutes "real" 4G, people banter ITU specifications back and forth, and very little gets settled once and for all. Some folks are happy for 4G to mean plenty of things; others want a more specific definition.
Needless to say, at Gizmodo Australia, we're in the latter camp.
It's simpler to have a term with an understood meaning; it helps consumers out when they're comparing services to make sure that they're actually the same thing. Now, admittedly, some consumers can be a touch on the daft side -- a recent survey of American and European users revealed that 46 per cent of iPhone 4 users thought the 4 meant it could access 4G networks -- but again, that's a matter of education and clearly defined specifications.
If all this 4G talk makes your head spin, Lifehacker's primer on 4G may be useful. Equally, be happy you don't live in the US, where the term 4G has been bastardised to cover everything from HSPA+ upwards; there 4G has been rendered all but meaningless even as a marketing term, especially when you consider WiMAX is also in that mix. It's ugly out there in the rest of the world, people. We'd prefer it didn't get that ugly, and therefore that confusing here at home.
I can totally understand where Telstra's coming from with its 4G branding. It has a jump on its competitors, will presumably be committed to its 1800MHz LTE rollout for the time being, and wants to own the 4G space ahead of Optus' claimed April launch and Vodafone's delayed 4G implementation. History is on Telstra's side here; it owns "NextG" as a term, even though it's just 850MHz 3G, the same as now used by Vodafone in certain areas. Again, Telstra was there first and came up with a snappy term to match its network. Not so with 4G, though, which already has traction as a term, and would probably be impossible to trademark anyway.
There's also a risk for Telstra; if either Optus or Vodafone does manage to deploy 4G that runs faster than Telstra's in real-world usage, they'll have an easy advertising hook to hang the service on. That having been said, I've personally hit above Telstra's claimed 40Mbps rate on a number of occasions; the proof will (as always) be in the head-to-head testing.
But what does this matter in terms of real world use? Well, let us return to the question I posed at the top of the article. When did high definition TV first kick off? Presuming you've not headed out to a big bad search engine in the meantime (or you're not a TV engineer), the chances are high that you'd peg it sometime around the year 2000. Those who like accuracy might like to note that 1996 saw the first US HDTV broadcasts, albeit only in an experimental capacity at that time.
Explain then, this photo, which I took recently at Alexandra Palace in London.
Can it be true? Did the boffins at the BBC pre-empt HDTV by nearly seventy years? Was the coronation broadcast in stunning 1080p?
No, it wasn't.
At the same time, though, that BBC plaque isn't quite lying.
The high definition it's referring to is a 405-line Marconi-EMI system. The competing technology at the time was the 240-line Baird system. That's Baird as in John Logie Baird, from whom we get the Logies, fact fans. Anyway, the plaque is historically correct, but at the same time it's functionally meaningless in terms of how "high definition" television is perceived today. In other words, changing how you promote the meaning of a term gives completely different end results; stick anyone in front of a 405-line 1936-era TV screen, and they're highly unlikely to view it as high definition.
This is why we're so fussed about "correct" 4G. In a sense, yes, you could call it something else, be it 4G-LTE, 4G-LTE-A or 4G-Little-Fluffy-Bunny-Rabbits for all it matters. But 3G is quite well understood, so it makes sense to use 4G to describe the next generation of services. If 4G is bastardised to mean many things, unless that's blindingly clear in the advertising for each and every product, we lose the functionality of the term. A consumer weighing up Telstra 4G versus Optus 4G versus Vodafone 4G versus Vivid 4G won't have any kind of simple shorthand way to compare them, because they're not the same thing, even though they'll be labelled as such.