Back in my day, there was only one HDR. You tied an onion to your belt, which was the style at the time, and you bought a TV that said it could play HDR video. You found an HDR Blu-ray or an HDR video on Netflix or YouTube, and you watched it. But these days, you kids, with your fancy newfangled Dolby Vision and Technicolor and your Hybrid Log Gamma...
Last year, when HDR was new(er), and the basic open HDR-10 standard was the only one that was publicly available, I said it was a great reason to buy a new screen -- whether it was an LCD TV or an OLED. I was wrong. It was a silly thing to say in hindsight. It's still a great reason, but it's also not as simple as just choosing a TV that has those three letters in its product description.
LG's new top of the line OLED TV is one of the first to be certified by Technicolor -- 'member Technicolor? -- for a new standard for HDR video playback. Technicolor claims its HDR improves upon its competitors'. It's apparently an open standard, too, so TV manufacturers other than LG can jump on board if they're keen.
But there are HDR standards that aren't open, too. Dolby Vision requires its own hardware chip to operate, and requires TV manufacturers to work closely with Dolby for certification throughout the build process. Dolby Vision is the most technically capable of the HDR standards, but as well as requiring hardware-certified TVs and Blu-ray players in your home entertainment setup, it'll only be available on a smaller, select list of streaming titles and upcoming Ultra HD Blu-ray discs.
And then there's Hybrid Log Gamma. HLG HDR was co-developed by the BBC and Japanese broadcaster NHK -- both of which are huge players in 4K content and beyond. It was designed for content other than the blockbuster movies and high-production-value Netflix series that we've been seeing as the poster children for current HDR. It's meant to deliver one stream of HDR video to many different displays, with each display interpreting that signal and showing the widest range of contrast it can.
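The "one stream, many displays" trick comes from HLG's transfer curve: the lower half of the signal range follows an ordinary square-root (gamma-like) curve that a standard-dynamic-range TV can display as-is, and only the upper half switches to a logarithmic segment that encodes the extra highlight range. Here's a minimal sketch of the HLG opto-electrical transfer function, using the constants published in ARIB STD-B67 / ITU-R BT.2100:

```python
import math

# HLG OETF (scene light -> signal), per ARIB STD-B67 / ITU-R BT.2100.
# Below the crossover point the curve is a plain square root, so an SDR
# display renders it like normal gamma video; above it, a log segment
# squeezes the extended highlights into the top of the signal range.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalised scene light e in [0, 1] to an HLG signal in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C
```

That square-root lower segment is why HLG doesn't need metadata at all: an older display just sees familiar gamma-shaped video, while an HDR display knows to stretch the log-encoded highlights back out.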
HDR-10 is the simplest of the HDR standards -- it attaches a single set of static metadata, including maximum and minimum brightness levels, to an entire movie or piece of content. Where Dolby Vision is much more granular, pushing brightness and contrast information for each individual frame, HDR-10 is basically a one-size-fits-all. Dolby Vision also works on streaming content that isn't Ultra HD, which is great for dodgy Aussie 'net connections, but HDR-10 is restricted to 4K.
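The difference is easy to picture as data: HDR-10's metadata is one record stamped on the whole title, while Dolby Vision-style dynamic metadata is a record per frame (or scene), so the TV can tone-map a dark cave and a sunlit beach differently. A rough sketch -- the field names here are illustrative, not the actual SMPTE or Dolby structures:

```python
from dataclasses import dataclass

@dataclass
class BrightnessMetadata:
    # Illustrative fields only, not the real metadata layout.
    max_nits: float   # peak brightness the content was mastered to
    min_nits: float   # deepest black level

# HDR-10 style: one static record applies to every frame of the title.
hdr10_title_metadata = BrightnessMetadata(max_nits=1000.0, min_nits=0.005)

# Dolby Vision style: dynamic metadata, one record per frame, so each
# scene can carry its own brightness and contrast information.
dolby_vision_frames = [
    BrightnessMetadata(max_nits=120.0, min_nits=0.001),   # dim interior
    BrightnessMetadata(max_nits=4000.0, min_nits=0.010),  # bright exterior
]

def tone_map_peak(display_peak_nits: float, meta: BrightnessMetadata) -> float:
    """Pick the brightness ceiling a display should map this content to."""
    return min(display_peak_nits, meta.max_nits)

# A hypothetical 600-nit TV uses the same ceiling for the whole HDR-10
# title, but can adapt frame by frame with dynamic metadata.
static_peak = tone_map_peak(600.0, hdr10_title_metadata)
dynamic_peaks = [tone_map_peak(600.0, m) for m in dolby_vision_frames]
```

The per-frame list is the whole point of Dolby Vision's extra certification and hardware: the display gets told, frame by frame, how the content was mastered, instead of guessing from one title-wide number.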
All of the current HDR standards offer something slightly different, which is both a good thing, and part of the problem. Hybrid Log Gamma is an open standard, and is technically far more feasible to deliver through existing free-to-air broadcast TV setups. HDR-10 is the base level for everyone to achieve at the very least. Dolby Vision is the creme de la creme. Technicolor HDR is a new entrant into the fight that might turn out to be something useful and special, but it's too early to tell.
If you buy a top of the line TV in 2017, chances are it will support at least three of these competing and complementary HDR standards. (Dolby Vision is fast emerging as the dark horse in this race.) But what happens in 2018, when my hypothetical Dolby Vision Pro or HDR-11 appears? It's the silly VHS versus Betamax race all over again. And consumers -- especially early adopters -- are the ones who will miss out as a result.
These are things that need to happen -- change is never easy, and it's often messy -- but god, wouldn't it be good if everyone could just sit down together and for once decide on a single standard, like 4K, for HDR? It'd make my life easier, at least.