High Dynamic Range (HDR) has become an important term to understand when shopping for a new TV or streaming TV shows and movies, because the technology makes content brighter and more vibrant.
While there are five different HDR formats, we’re going to focus on the top three in terms of widespread availability: HDR10, Dolby Vision, and HDR10+. Each of these formats brings something different to the table and could dramatically enhance your viewing experience. This guide will walk you through the differences between the formats and how those differences might affect your buying decision regarding a new TV or streaming device.
What is HDR?
HDR unlocks more colors, higher contrast, and greater brightness. With a wider color gamut than Standard Dynamic Range (SDR), HDR reproduces hues that TVs couldn’t display for many years, allowing far more realistic content to be presented on today’s 4K TVs. SDR can only present a fraction of the color depth that HDR can: SDR displays are limited to 256 shades each of red, green, and blue, while HDR can show 1,024 shades of each.
To display HDR content, you need both an HDR-capable TV and a source of HDR content, be it a media player like a Blu-ray player or a streaming service like Netflix or Hulu.
Further reading
- 4K TV buying guide: Everything you need to know before you go shopping
- HDR TV: What it is and why your next TV should have it
- UHD and HDR: Everything you need to know
- What is Dolby Vision? The dynamic HDR format fully explained
- What is HDR10+? Everything you need to know about the new HDR format
Image quality
HDR’s big advantage is better image quality. Whether you’re watching the latest Marvel film or playing a game on your Xbox console, better image quality will improve your overall experience. But how exactly does HDR deliver that improved picture quality when compared with SDR? The answer lies in three elements: Bit depth, brightness, and metadata. Let’s take a quick look at each one and how the different HDR formats use them.
Bit depth
Bit depth describes the number of colors a movie or TV show includes as well as the number of colors a TV can display. Each pixel of your TV is made up of three discrete colors: Red, green, and blue (RGB). Each of these colors can be broken down into hues. The greater the bit depth, the greater the number of hues you get, and thus the greater the number of colors.
SDR content, for instance, uses a bit depth of 8 bits. Eight bits allows for up to 256 hues each of R, G, and B. If you multiply 256 x 256 x 256, you get 16.7 million possible colors. That sounds like a lot, but when you look at the HDR10 and HDR10+ formats — which make use of 10 bits and can display up to 1.07 billion colors — it’s easy to see that HDR is much more colorful. Dolby Vision takes that up a notch with 12 bits, for a maximum of 68.7 billion colors.
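If you want to check those figures yourself, the math is straightforward: each bit depth gives 2 raised to that power shades per color channel, and the total palette is that number cubed. Here’s a quick sketch:

```python
# Shades per color channel and total colors for each bit depth.
# 8-bit = SDR, 10-bit = HDR10 / HDR10+, 12-bit = Dolby Vision.
for label, bits in [("SDR (8-bit)", 8), ("HDR10 / HDR10+ (10-bit)", 10), ("Dolby Vision (12-bit)", 12)]:
    shades = 2 ** bits    # shades of red, green, or blue per channel
    colors = shades ** 3  # every possible red/green/blue combination
    print(f"{label}: {shades:,} shades per channel, {colors:,} total colors")
```

Run it and you get 16,777,216 colors for 8-bit, 1,073,741,824 for 10-bit, and 68,719,476,736 for 12-bit, which is where the 16.7 million, 1.07 billion, and 68.7 billion figures come from.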
While TVs that can handle 10-bit color are quite common, no TVs support 12-bit color yet, so Dolby Vision’s massive color advantage is moot for the time being.
Brightness
TV brightness is measured in candelas per square meter (cd/m²) or nits (1 nit is roughly equal to 1 cd/m²). When it comes to peak brightness, Dolby Vision takes the crown. It can support display brightness of up to 10,000 cd/m², whereas both HDR10 and HDR10+ max out at 4,000 cd/m². For now, you’ll be hard-pressed to find TVs that can deliver anywhere close to 4,000 cd/m², but every year TVs get brighter, and this makes Dolby Vision more future-proof than other formats.
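For a sense of how much headroom those ceilings leave, here’s a rough sketch; the TV brightness figures below are illustrative assumptions, not measurements of any specific model:

```python
# Peak brightness each format can describe in its metadata, in nits (1 nit is roughly 1 cd/m²).
FORMAT_CEILING_NITS = {"HDR10": 4_000, "HDR10+": 4_000, "Dolby Vision": 10_000}

# Hypothetical peak-brightness figures for a few tiers of TV (assumptions for illustration).
TV_PEAK_NITS = {"entry-level HDR TV": 400, "mid-range TV": 1_000, "flagship TV": 2_000}

for tv, peak in TV_PEAK_NITS.items():
    for fmt, ceiling in FORMAT_CEILING_NITS.items():
        print(f"{tv} ({peak} nits): {fmt} ceiling is {ceiling:,} nits "
              f"({ceiling - peak:,} nits of headroom)")
```

Even a very bright TV today sits well below every format’s ceiling, which is why the brighter displays of the future are where Dolby Vision’s extra headroom would matter.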
Metadata
Metadata, in the context of HDR, is an extra layer of information that tells a TV how it should display the content it’s receiving. This information covers things like peak brightness, contrast, and something known as tone mapping, all of which contribute to making HDR video look so much better than SDR. However, not all HDR formats use the same kind of metadata. HDR10 uses static metadata, which means the information governing brightness, contrast, etc., is delivered to the TV at the beginning of a movie or TV show and doesn’t change until you switch to a new movie or TV show.
Dolby Vision and HDR10+, on the other hand, use dynamic metadata. This allows each scene — or even each frame of video — to be properly adjusted for the best results.
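Conceptually, the difference looks something like the sketch below. The field names and values are simplified illustrations, not the exact identifiers used by the HDR specifications:

```python
# Static metadata (HDR10): a single set of values describes the entire movie or show.
static_metadata = {
    "mastering_display_peak_nits": 1_000,   # brightest the mastering display could go
    "max_content_light_level": 1_000,       # brightest single pixel anywhere in the title
    "max_frame_average_light_level": 400,   # brightest full frame, on average
}

# Dynamic metadata (Dolby Vision, HDR10+): values can change per scene or even per frame,
# so a dim interior and a sunlit exterior each get their own tone-mapping guidance.
dynamic_metadata = [
    {"scene": "night interior", "max_light_level_nits": 150, "average_light_level_nits": 20},
    {"scene": "sunlit exterior", "max_light_level_nits": 1_000, "average_light_level_nits": 350},
]
```

With static metadata, the TV has to pick one tone-mapping strategy that works for the whole title; with dynamic metadata, it can treat dark and bright scenes differently.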
Availability
HDR10 is considered the default HDR format. This means that if a movie is presented in HDR, a streaming media device claims to support HDR, or a TV is marketed as an HDR TV, they will all support HDR10 at a minimum. This near-universal support puts HDR10 head and shoulders above Dolby Vision and HDR10+ when it comes to availability of both content and devices.
But Dolby Vision, once considered a hard-to-find premium option, is quickly catching up to HDR10. You’ll find Dolby Vision support on plenty of HDR TVs with the exception of Samsung TVs. Samsung continues to be the one TV maker that refuses to pay Dolby’s licensing fees for Dolby Vision. Dolby Vision content is also becoming more commonplace. You’ll find it on UHD Blu-ray discs as well as streaming services such as Disney+, Apple TV+, Netflix, and Amazon Prime Video, just to name some of the most popular options.
Pulling up the rear is HDR10+. Despite its royalty-free licensing, it has the lowest adoption among TV makers. In the U.S., you’ll find it on all Samsung HDR TVs and select Vizio and Hisense models. Elsewhere in the world, Panasonic, TCL, Toshiba, and Philips sell HDR10+-capable TVs. And while your TV may support HDR10+, finding content in this format could prove challenging. Right now, Amazon Prime Video is one of the few reliable sources of HDR10+ material.
Conclusion
With better brightness, color, and the benefits of dynamic metadata, Dolby Vision is clearly the best HDR format. It’s supported on TVs from LG, Vizio, TCL, Hisense, and Sony, and you can find it on an increasing number of the top streaming services.
However, HDR10, as the de facto HDR format, remains the most accessible of the three from both a content and a device point of view. HDR10+ may offer many of the same benefits as Dolby Vision, but its slow adoption among TV makers and content creators and distributors makes it more of a footnote in the HDR space than a truly viable alternative to HDR10 and Dolby Vision. That could easily change in the future, though: There are few if any roadblocks to its broader adoption.
But here’s the good news: HDR formats aren’t mutually exclusive. If you buy a TV that doesn’t support Dolby Vision, you can still watch HDR10 (or HDR10+ if applicable). And because streaming providers can offer multiple HDR versions of the same movie, your TV should automatically receive the highest-quality format it supports. For instance, if you stream a Marvel movie from Disney+ that is available in Dolby Vision, your TV will still get the HDR10 version if that’s the best format it can display.
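That fallback behavior works roughly like the sketch below; the function and the preference order are hypothetical, purely to illustrate the idea:

```python
# Hypothetical helper: pick the best HDR format a TV supports from the versions
# a streaming service offers for a given title.
PREFERENCE = ["Dolby Vision", "HDR10+", "HDR10", "SDR"]  # best to worst, for illustration

def pick_format(available_on_service, supported_by_tv):
    for fmt in PREFERENCE:
        if fmt in available_on_service and fmt in supported_by_tv:
            return fmt
    return "SDR"  # every title has at least an SDR version to fall back on

# Example: a Dolby Vision title played on a TV that only supports HDR10.
print(pick_format({"Dolby Vision", "HDR10", "SDR"}, {"HDR10", "SDR"}))  # prints "HDR10"
```

In practice, this negotiation happens automatically between the streaming app and your TV, so you never have to choose a format yourself.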