A combination of fractured development, belated standardization, marketing over-eagerness and its use as a weapon in the bitter OLED vs LCD feud has conspired to make the mere mention of the letters ‘HDR’ enough to cause your average consumer’s eyes to glaze over.
Even today, with a second generation of ‘HDR-capable’ TVs now upon us, the AV world still seems hell-bent on piling more confusion and obfuscation around its latest bid to get us all to recycle our TVs again.
So the plan here is to try to cut through all the jargon and bluster and make HDR accessible to anyone. After all, when you strip away the techno-babble, the only thing that truly matters about HDR is that it can deliver TV picture quality far beyond anything seen before. It would be truly sad if consumers decided to shun it just because they didn’t understand it.
If you’ve got a reasonably recent smartphone you’ve probably already come across the concept of HDR from a photography perspective. However, HDR for still photography is a very different kettle of fish to HDR for video.
When it comes to HDR in still photography, your camera takes multiple exposures of the same shot and combines them to make the end result look more dynamic. With HDR video, a much greater brightness range is captured in the camera and then retained through the mastering and delivery stages.
The result is pictures with far more dynamism and contrast, containing much more of the range of light our eyes see in the world around us every day, from bright highlights to low-light detail.
The difference between standard and high dynamic range images really can be spectacular – you certainly don’t have to have a trained eye or a TV the size of a small country to appreciate it.
The current target for HDR TVs is to go far beyond the color range of the old TV picture standards we’ve been stuck with for decades and reproduce a color range that gets as close as possible to the one used in high-quality commercial digital cinemas (the so-called DCI-P3 gamut). So as well as reproducing pictures on your TV that look more like real life, HDR enables you to see films at home that look much more like they were designed to look at the cinema. In fact, HDR has the built-in potential to go far beyond the current cinema color standard as technological advances permit.
What you need to watch HDR
HDR is an end-to-end technology. The source pictures need to be created containing HDR-friendly color and brightness ranges; the distribution method needs to retain all the extra HDR brightness and color information; and the display device you’re watching HDR on has to be capable of reading and managing the HDR data, as well as (hopefully!) having sufficient brightness and color capabilities to deliver at least some of HDR’s picture quality benefits.
In short, if you want to watch HDR at home you’ll need an HDR-capable TV and at least one HDR-capable source.
What HDR sources are there?
The recently launched Ultra HD Blu-ray format supports HDR, and every disc released to date includes an HDR version of the film it contains.
On the streaming side, Netflix, Amazon and VUDU all offer HDR content, though note that only Netflix and Amazon currently offer HDR services outside of North America.
Does all HDR look the same?
No. For starters, film-makers can have very different visions of how dynamic a particular movie should look, in terms of its ‘baseline’ brightness and how far into the darkest and brightest extremes of HDR’s expanded light range its pictures should venture.
When it comes to getting HDR out to consumers, the various distribution channels work to slightly different standards and in different ways, which can also have an impact on the HDR picture you experience.
Finally, there can be big differences in the way different TVs deliver HDR, based on everything from the inherent brightness, contrast and color capabilities of their screens to the types of HDR they support and the quality of their video processing.
Let’s look in more depth at some of these HDR differentiators.
Different types of HDR
The AV industry took so long to formalize and standardize HDR that you’d be forgiven for thinking there’s an almost infinite number of different HDR formats out there. Fortunately, though, as things stand today there are really only two HDR approaches you need to worry about as an end user, since in practice relatively small HDR variations tend to be handled in the background by a TV’s software.
Open Standard HDR
Pretty much any TV that claims to be HDR capable will support the open HDR standard (commonly known as HDR10) developed by the Society of Motion Picture and Television Engineers. This standard also has to be supported on Ultra HD Blu-rays that use HDR, and the expectation is that the vast majority of streaming services offering HDR will support it too.
Certainly that’s the case with Netflix and Amazon, though VUDU has thrown something of a spanner in the works by only offering HDR on its platform using the Dolby Vision system. Which brings us to…
Dolby Vision
Dolby Vision has, as you can probably guess, been developed by Dolby Laboratories. It differs from the open HDR standard in a number of ways, but the main four are: a) any delivery system or display hardware that wants to support it has to pay a license fee to Dolby; b) any Dolby Vision hardware needs a dedicated Dolby Vision decoder chip inside; c) color is mastered to 12 bits rather than open HDR’s 10 bits; and d) it adds an extra ‘layer’ of information to the main HDR picture data which optimizes the way pictures look, scene by scene, for whichever Dolby Vision-capable TV they’re being watched on.
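To put that bit-depth difference in perspective, here’s a quick back-of-the-envelope calculation. This is a plain Python illustration of the arithmetic implied by the bit depths alone, not anything taken from either format’s documentation:

```python
# Shades per color channel at a given bit depth: 2 ** bits
hdr10_shades = 2 ** 10  # open HDR (HDR10): 1,024 shades per channel
dolby_shades = 2 ** 12  # Dolby Vision: 4,096 shades per channel

# Total displayable colors: one value each for red, green and blue
print(f"Open HDR:     {hdr10_shades:,} shades/channel, {hdr10_shades ** 3:,} colors")
print(f"Dolby Vision: {dolby_shades:,} shades/channel, {dolby_shades ** 3:,} colors")
```

That works out at roughly 1.07 billion possible colors for 10-bit mastering versus around 68.7 billion for 12-bit, which is why Dolby Vision can, in theory, deliver smoother color gradations.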
The need for a dedicated Dolby Vision chip in Ultra HD Blu-ray players or TVs/projectors that want to support it means that unlike open HDR it’s not nearly so widely available; at the time of writing no Dolby Vision Ultra HD Blu-ray players exist, and only select TVs by LG, Vizio and TCL support it.
Philips TVs in America are due to implement it too, though, and I suspect other brands will come on board in the months and years to come. You’ve got to think that LG’s upcoming debut Ultra HD Blu-ray player will support Dolby Vision as well, given that the brand’s latest TVs do.
Dolby Vision also requires special mastering of films and TV shows, and at the time of writing there are no Ultra HD Blu-ray releases out there that support the Dolby Vision format. However, Universal has already pledged to include Dolby Vision options on its future UHD BD releases, and as I’ve already mentioned, VUDU exclusively supports Dolby Vision for its HDR streams.
Why TVs don’t all look the same with HDR
The differences in how effectively TVs reproduce HDR can be pretty extreme for a number of reasons, making choosing the right TV the single hardest part of the HDR story from a consumer’s point of view.
At the heart of the problem is the fact that so many different picture factors go into creating HDR images that it’s really difficult for current TV technologies to nail all of them at once.
Some screens can deliver more of the new color range associated with HDR, for instance, while others do better at delivering the extra brightness and/or contrast of HDR’s expanded luminance capabilities. Some TVs and TV technologies are better than others at controlling how locally HDR’s extreme brightness peaks are reproduced, and some do better than others at ‘remapping’ the full range of color and brightness information an HDR source may contain to the usually rather more limited capabilities of their screens, as the sketch below illustrates.
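Here’s a deliberately simplified sketch of the kind of ‘remapping’ curve involved. The 750-nit display peak and the knee point are made-up numbers for illustration; real TVs use far more sophisticated, proprietary tone-mapping:

```python
def tone_map(nits_in, display_peak=750.0, knee=0.75):
    """Toy remapping curve: pass darker tones through untouched, then
    roll off highlights so even a 4,000-nit mastered highlight lands
    within what the screen can actually show. Purely illustrative."""
    threshold = display_peak * knee
    if nits_in <= threshold:
        return nits_in  # within the screen's comfort zone: leave as-is
    # Compress everything above the knee toward the display's peak
    overshoot = nits_in - threshold
    headroom = display_peak - threshold
    return threshold + headroom * (overshoot / (overshoot + headroom))

for source_nits in (100, 500, 1000, 4000):
    print(f"{source_nits:>5} nits in the source -> {tone_map(source_nits):6.1f} nits on screen")
```

How gracefully a TV squeezes those out-of-range highlights into its own capabilities is a big part of what separates good HDR performance from bad.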
So how do I figure out which TVs are the best for HDR?
The AV industry has tried to help here by coming up with a series of recommended TV specifications for HDR playback, via a cross-industry body called the UHD Alliance. These include minimum color, brightness, resolution and contrast requirements, and any TV (or Ultra HD Blu-ray player and Ultra HD Blu-ray disc, actually) that hits or surpasses the full array of recommendations can earn an ‘Ultra HD Premium’ badge to use on its marketing materials and packaging.
Certification is voluntary, though; Sony, for instance, prefers its own ‘4K HDR’ branding. This means there are TVs out there, such as Sony’s XD930 and XD940 models, that are apparently capable of earning the Ultra HD Premium badge but whose maker has chosen not to pursue it.
The other issue with the Ultra HD Premium logo is that to some extent it’s just a numbers game. Because so many interconnected picture quality traits go into delivering HDR convincingly, it doesn’t necessarily follow that a TV sporting the Ultra HD Premium badge will always deliver a truly stellar HDR performance, even if it looks on paper like it should.
Particularly problematic for LCD TVs, for instance, is light pollution. Since LCD TVs rely on external lighting systems rather than each pixel producing its own light independently, the appearance of bright objects against dark backgrounds in an HDR picture can cause distracting problems like streaks of light running right down the screen or glowing halos around the bright object. For more on this, see my recent reviews of the Samsung UN55KS9500, Sony 65X930D and Panasonic 50DX750.
With this LCD problem in mind, LG’s OLED TVs, where every pixel produces its own light, look like better bets for HDR. Yet OLED screens are not currently capable of hitting the same brightness peaks that LCD ones can, meaning they can find it harder to remap HDR’s most extreme brightness peaks effectively to their screen capabilities.
Do you need faster broadband to stream HDR?
Only slightly. One of the best things about HDR is that unlike 4K/UHD resolution, it adds relatively little data to a video stream. Amazon actually claims that HDR adds no significant data bandwidth requirements to its streams, while Netflix claims HDR adds between 10 and 20 per cent extra streaming bandwidth. So given that Netflix claims you should have at least 25Mbps of broadband speed for stable Ultra HD streaming, you’re looking at needing around 30Mbps for stable UHD HDR streams.
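If you want to sanity-check that arithmetic yourself, the sum is trivial. The 25Mbps baseline and the 10-20 per cent overhead figures below are simply the ones quoted above, not independent measurements:

```python
# Netflix's stated Ultra HD baseline, plus its claimed HDR overhead
uhd_baseline_mbps = 25.0

for overhead in (0.10, 0.20):
    needed = uhd_baseline_mbps * (1 + overhead)
    print(f"+{overhead:.0%} HDR overhead -> about {needed:.1f} Mbps recommended")
```

That gives a range of roughly 27.5 to 30Mbps, which is where the ‘around 30Mbps’ rule of thumb comes from.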
What happens if you watch standard dynamic range video on an HDR TV?
Many HDR-capable TVs, especially the relatively high-end ones, feature ‘remastering’ systems capable of expanding the color and brightness range of standard dynamic range sources to ‘fill’ the HDR-friendly capabilities of their screens. This is a very difficult thing to achieve successfully, though; unless a TV’s processing is exceptional you may see issues such as color balance problems, unnatural color tones, bleached whites, forced, detail-light dark scenes and exaggerated video noise.
I’m not saying you should deactivate an HDR TV’s SDR-to-HDR conversion systems by default; some TVs really can do a better job of upconverting SDR than you might think. If, however, you find yourself acutely aware of any of the SDR upconversion issues listed above, it’s worth at least experimenting with turning all or some of the upconversion circuitry off. The sketch below shows, in very crude terms, what such systems are trying to do.
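In this deliberately crude sketch of SDR-to-HDR expansion, the 100-nit SDR reference, the 1,000-nit target and the power-curve ‘boost’ are illustrative assumptions; real TVs use far more sophisticated, proprietary processing:

```python
def sdr_to_hdr(nits_in, sdr_peak=100.0, hdr_peak=1000.0, boost=1.5):
    """Toy SDR-to-HDR 'remastering': stretch an SDR brightness level
    onto an HDR screen's wider range. A power curve gives highlights
    a proportionally bigger lift than shadows, which also hints at
    why noise and color errors can end up exaggerated."""
    level = nits_in / sdr_peak          # normalize to 0.0-1.0
    return hdr_peak * (level ** boost)  # expand onto the HDR range

for sdr_nits in (10, 50, 90, 100):
    print(f"SDR {sdr_nits:>3} nits -> HDR {sdr_to_hdr(sdr_nits):6.1f} nits")
```

Notice how a 100-nit SDR white gets a tenfold boost while a 10-nit shadow only gets around three times brighter; getting that balance wrong is exactly what produces the bleached whites and exaggerated noise described above.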
What’s next for HDR?
With content mastering standards for HDR largely set and appearing to contain many years’ worth of future-proofing, I’d expect the main areas of improvement and change for HDR to take place in the TV hardware world.
We can probably expect a ‘nit race’, for starters, where TVs push for ever higher brightness levels to unlock more of the potential contained in HDR video sources, most of which are mastered to much higher brightness standards than anything today’s TVs can achieve.
Hand in hand with extra brightness, we should see TVs getting better at controlling how locally light appears on the screen. This may extend to more brands adopting OLED technology, though as discussed earlier, OLED has its own brightness challenges to overcome.
There will also be a push by TV makers to expand the color range their TVs can achieve. At the moment the focus is on screens reaching 100% of the color range used in digital cinemas (DCI-P3), but the HDR format includes the potential to extend way beyond that if film makers want to go there.
Also likely to undergo a period of change is the way films and, especially, TV shows are shot, as their makers start to experiment with the new story-telling options HDR makes possible.
Source: John Archer / Forbes