HDR TVs

Television manufacturers tend to focus on a particular buzzword or acronym almost every year. In the past, we have been treated to HD (high definition), denoting that a set is capable of running a minimum 720p resolution. 3D came and went, and UHD (ultra-high definition), also known as 4K, is now very much in vogue on larger sets. Another popular one recently is HDR, or high dynamic range. In this TekSpek we explain what HDR is and then compare the various competing formats so that you can make the most informed purchasing decision.

What is HDR?

HDR aims to improve the image quality of televisions by increasing overall brightness, enabling a wider colour gamut, and delivering, as the name intimates, a bigger dynamic range between whites and blacks. Put simply, a good HDR implementation makes video look far more lifelike and more closely mimics the range of light the human eye can actually perceive. There is significantly more 'pop' in a quality HDR picture than on a standard-dynamic-range (SDR) set.

The primary part of the HDR jigsaw is brightness. Most non-HDR sets, typically budget offerings or those manufactured before 2015, output a maximum brightness of around 250 nits, where a single nit is equivalent to one candela per square metre. This is why you may see the figure reported as cd/m². Through vastly improved backlighting, premium LCD TVs can deliver brightness figures in excess of 1,000cd/m², with next-generation models doubling that to 2,000cd/m². Such retina-searing brightness is far more adept at replicating strong sources of light such as sunsets or car headlamps.

Yet brightness remains a single facet of HDR. Another is the ability to produce extreme brightness and near-darkness in the same scene. Imagine a night-time scene where a super-bright street lamp illuminates our protagonist. The lamp's light only carries so far, covering our hero's face and body, while much of the scene's periphery remains either in shadow or in total darkness. A high-quality HDR set is able to show these two ends of the spectrum concurrently without compromise. The luminance difference between the whitest whites and deepest blacks is known as the contrast ratio, and the higher the better. A quality set is able to produce a 20,000:1 ratio, which means the peak whites are 20,000 times brighter than the deepest blacks. Of course, in an ideal world, you would want an infinite contrast ratio.
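
To make the arithmetic concrete, here is a minimal sketch using illustrative figures, showing how a contrast ratio is derived from peak brightness and black level:

```python
# Contrast ratio is simply peak white luminance divided by black level.
# Illustrative figures, matching the 20,000:1 example above.
peak_white_nits = 1000.0   # peak brightness in nits (cd/m2)
black_level_nits = 0.05    # deepest black the panel can hold

contrast_ratio = peak_white_nits / black_level_nits
print(f"Contrast ratio: {contrast_ratio:,.0f}:1")   # Contrast ratio: 20,000:1
```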

An example of a premium HDR telly - the LG 65B7V

Lastly, a decent HDR set ought to be able to produce a massive array of colours on top of extreme brightness and great contrast. A typical 8-bit panel is able to produce a potential 16,777,216 colours, or 256 shades each for the red, green and blue channels. This has been the colour standard for some time. True 10-bit panels are able to produce 1,024 shades of each primary colour, meaning a total of over a billion displayable colours (1,073,741,824). This 64x improvement over 8-bit is most keenly felt in scenes where there is a very gradual colour shift - a lovely sunset, for example.
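
The colour figures above follow directly from the bit depth; this short sketch simply reproduces the arithmetic:

```python
# Total displayable colours = (shades per channel) ^ 3 for an RGB panel.
def total_colours(bits_per_channel: int) -> int:
    shades = 2 ** bits_per_channel   # 256 shades at 8-bit, 1,024 at 10-bit
    return shades ** 3               # every red/green/blue combination

print(total_colours(8))    # 16777216   (16.7 million)
print(total_colours(10))   # 1073741824 (over a billion, 64x the 8-bit figure)
```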

Some HDR televisions use a method called 8-bit plus frame-rate control (FRC). By using a cheaper 8-bit panel and then employing a dithering algorithm - the FRC bit - to blend the colour of adjacent pixels, these sets artificially raise the number of viewable colours. Full-on 10-bit is always better than 8-bit plus FRC, which is better than 8-bit.
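
As a rough illustration of the FRC idea, the toy sketch below approximates a 10-bit level on an 8-bit panel by alternating between the two nearest 8-bit values over successive frames. Real panel firmware is far more sophisticated, and the function name here is purely hypothetical:

```python
# Toy temporal dithering: show the two nearest 8-bit levels across four frames
# so their average approximates the requested 10-bit level.
def frc_frame_levels(level_10bit: int, frames: int = 4) -> list[int]:
    low = level_10bit // 4                # nearest 8-bit level at or below the target
    high_frames = level_10bit % 4         # how many frames need the level above
    return [min(low + 1, 255) if f < high_frames else low for f in range(frames)]

print(frc_frame_levels(514))   # [129, 129, 128, 128] -> averages 128.5, i.e. 514/4
```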

Okay, so HDR requires a super-bright screen, a great contrast ratio, and enough colour depth for seamless gradient changes. Seems easy to standardise, right? Not quite. At the time of writing, there are at least four HDR flavours doing the rounds: HDR10, HDR10+, Dolby Vision and HLG. Each has its pros and cons.

HDR10

Ultra HD Premium

This is where it becomes nebulous. TV manufacturers launched the initial raft of HDR-ready sets without paying too much attention to actual specifications. If the TV was pretty damn bright, well, that was good enough for it to be badged HDR-ready. Better-than-average contrast ratio? Yup, HDR-ready. This meant that there was significant discrepancy between sets, and whilst all were better than SDR at providing an immersive picture, the consumer never really knew much about the underlying technology.

That, to some extent, is still the case in 2018. The simplest way to know the actual quality of HDR on a set is to look for the Ultra HD Premium logo, backed by the UHD Alliance, which sets hard-and-fast rules about what constitutes minimum HDR compliance.

Ultra HD Premium compliance means the TV needs to have a minimum 3,840x2,160 resolution (4K, in other words), which is unrelated to HDR. More importantly, the set needs to be able to process the 10-bit colour we spoke about above, though it doesn't need to be able to display every one of those colours. In terms of colour, sets need to display at least 90 per cent of what is known as the DCI-P3 spectrum, which is richer and wider than the sRGB or Adobe RGB spaces commonly used for computer monitors. On top of that, it needs to have excellent brightness, equating to either 1,000 nits peak and less than 0.05 nits for blacks (a 20,000:1 contrast ratio, therefore), or 540 nits peak and less than 0.0005 nits black level. The two standards are there to accommodate LED and OLED technologies, respectively.
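
Condensing the brightness side of that specification into code, the sketch below mirrors the two tiers just described; the function name and sample figures are my own, purely for illustration:

```python
# Two brightness routes to Ultra HD Premium: >=1,000 nits peak with <0.05 nits
# black (LCD/LED), or >=540 nits peak with <0.0005 nits black (OLED).
def meets_premium_brightness(peak_nits: float, black_nits: float) -> bool:
    led_tier = peak_nits >= 1000 and black_nits < 0.05
    oled_tier = peak_nits >= 540 and black_nits < 0.0005
    return led_tier or oled_tier

print(meets_premium_brightness(1100, 0.04))    # True  - premium LCD territory
print(meets_premium_brightness(700, 0.0004))   # True  - OLED with near-perfect blacks
print(meets_premium_brightness(400, 0.1))      # False - HDR badge, but not Premium
```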

Achieve all that and your Ultra HD Premium set will have the chops to produce really decent HDR imagery and video. Do be aware that only the best LED TVs and top-tier OLEDs can lay claim to such video excellence. Anything less may claim HDR status, of course, but the specs may well be much lower... and you will probably never find out just what they are.

HDR10+

HDR 10+ TV

Building upon HDR10, Samsung and Amazon have collaborated to release the royalty-free HDR10+. What's the difference, you might ask? Good question. Normal HDR10 is coded in such a way that an entire movie's brightness levels are set once at the very start. These levels indicate absolute maximum and minimum brightness, so if a particular scene is very dark, albeit with the odd lighter highlight, only a small brightness range will be applied to it.

HDR10+ uses what is known as dynamic metadata to set the minimum and maximum brightness on a per-scene basis, giving the TV more-precise luminance information at every step. Going back to the example, even a dark scene can have a wider range of colour gradients because HDR10+ takes specific scene context into account. The upshot is, if done well, even punchier HDR that extracts maximum detail and accurate brightness from every scene. Just take a look at the above picture for confirmation.
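
A simple way to picture the difference is as data: HDR10 ships one set of luminance limits for the whole film, whereas HDR10+ carries limits per scene. The structures below are purely illustrative, not the actual bitstream format:

```python
# HDR10: static metadata - one brightness envelope for the entire film.
hdr10_metadata = {"max_nits": 1000, "min_nits": 0.05}

# HDR10+: dynamic metadata - an envelope tailored to each scene.
hdr10_plus_metadata = [
    {"scene": "night street", "max_nits": 150,  "min_nits": 0.005},
    {"scene": "sunset",       "max_nits": 1000, "min_nits": 0.05},
]

# With per-scene limits, the TV can devote its full tonal range to shadow detail
# in the dark scene instead of grading it against the film-wide 1,000-nit ceiling.
```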

Dolby Vision

A direct competitor to HDR10+, Dolby Vision also uses dynamic metadata to optimise HDR for a film or show. The key difference is that TV manufacturers need to pay Dolby a fee for its technology. In return, Dolby insists on a higher contrast ratio and, potentially, a wider colour gamut than either HDR10 or HDR10+. There's also scope within Dolby Vision for 12-bit panels, something which HDR10+ doesn't currently support. The good news for consumers is that both Dolby Vision and HDR10+ have significant roots in HDR10, meaning it is relatively straightforward for content providers to offer both for their material.

HLG

Hybrid Log-Gamma (HLG) is perhaps the most pertinent of all the HDR formats as it will be the most widely used in the broadcast world. Born from a collaboration between national broadcasters the BBC (UK) and NHK (Japan), HLG is designed as an easy-to-implement system that carries both SDR and HDR content in one stream, meaning the content is screen-specific - an SDR set shows the standard picture while an HDR set extracts the extra range from the same signal. HLG streams can be compressed using the industry-standard VP9 (Google) or H.265 formats, and having a single stream, rather than separate ones for SDR and HDR, further reduces bandwidth.

However, HLG won't simply work on any HDR telly. Rather, the manufacturer needs to ensure its sets support the HLG curve. The lower half of the signal uses a conventional gamma curve, while the upper half uses a logarithmic curve, and it is the combination of the two that gives full-on HDR.
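
For the curious, the HLG transfer curve is published in ITU-R BT.2100; the sketch below implements the opto-electrical transfer function to show how the square-root (gamma-like) segment covers the lower half of the signal and the logarithmic segment the upper half:

```python
import math

# HLG OETF (ITU-R BT.2100): square-root segment for low light, log segment above.
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_oetf(scene_light: float) -> float:
    """Map normalised scene light (0..1) to the HLG signal value (0..1)."""
    if scene_light <= 1 / 12:
        return math.sqrt(3 * scene_light)
    return A * math.log(12 * scene_light - B) + C

print(round(hlg_oetf(1 / 12), 3))   # 0.5 - the gamma-like segment fills the lower half
print(round(hlg_oetf(1.0), 3))      # 1.0 - the log segment handles the brightest highlights
```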

It's worth appreciating that whilst backed by big players, HLG is very much in its infancy. One can successfully argue that HLG, or any subsequent equivalent, will have the biggest impact upon HDR adoption in the next two years.

Wrap

Unlike 3D, HDR is going to become pervasive in the TV world in the near future. Offering searing whites, deep blacks, and a bountiful colour palette, seeing a well-optimised HDR telly next to a standard 4K set is, quite literally, illuminating. HDR offers a far more lifelike, immersive picture when done well.

Any decent TV ought to be compatible with the different variants of HDR, and one way to know it meets a reasonable minimum specification is to look for the Ultra HD Premium logo. If that's not present, any set touting HDR10+ or Dolby Vision ought to also guarantee a decent HDR experience. Failing that, find out whether the set has a peak brightness of over 1,000 nits, a contrast ratio in excess of 20,000:1, and a wide colour gamut. If it does, you will have a good HDR experience.

Yet the biggest obstacle facing adoption is not the technical merit of LED or OLED TVs; it is how quickly broadcasters adopt HDR. Netflix and Amazon have already begun streaming HDR content, and it is reasonable to assume that national broadcasters will follow in the next two to three years. Games console manufacturers have also jumped on the HDR bandwagon, with both the Xbox One and PlayStation 4 capable HDR exponents.

The future is bright, the future is HDR.