
HDR10 vs HDR10+ vs Dolby Vision
Which is better?

High Dynamic Range (HDR) is a video technology that enhances picture quality compared to regular content (see HDR vs. SDR). There are three main HDR formats: HDR10, HDR10+, and Dolby Vision. Each displays HDR content differently and has its own advantages and disadvantages. Other HDR formats exist, but these three are the most widely used. When shopping for a new TV, you shouldn't worry too much about which formats it supports, because the TV's performance matters much more for HDR picture quality. If you do want to get the most out of your favorite content, here are the different ways these formats deal with the key aspects of HDR.

Also, check out the best HDR TVs here.


Differences Between HDR10, HDR10+, and Dolby Vision

If you're comparing the three main HDR formats, there are a few things you need to look at, including color depth, brightness, tone mapping, and metadata. HDR10 is the most basic format out of the three, and any modern 4k TV supports HDR10. Dolby Vision and HDR10+ are the more advanced formats, and while many TVs have either HDR10+ or Dolby Vision support, some TVs support both, so they're not mutually exclusive. Below you can see the main differences between each format.

HDR10

What it is: Open standard for HDR.

HDR10+

What it is: Royalty-free standard for HDR.

Dolby Vision

What it is: Proprietary standard for HDR made by Dolby.

                         HDR10      HDR10+     Dolby Vision
Bit Depth                Good       Great      Great
Peak Brightness Minimum  Good       Good       Great
Peak Brightness Maximum  Excellent  Excellent  Excellent
Tone Mapping             Good       Better     Best
Metadata                 Static     Dynamic    Dynamic
TV Support               Amazing    Good       Great
Content Availability     Best       Great      Excellent

Bit Depth

Color bit depth is the amount of information the TV can use to tell a pixel which color to display. If a TV has a higher color depth, it can display more colors and reduce banding in scenes with shades of similar colors, like a sunset. An 8-bit TV displays 16.7 million colors, which is typical for SDR content, while 10-bit color depth allows for 1.07 billion colors. 12-bit displays take it even further with 68.7 billion colors. Both Dolby Vision and HDR10+ can technically support content above 10-bit color depth, but that content is limited to Ultra HD Blu-rays with Dolby Vision, and even then, few of them go up to 12-bit color depth. HDR10 can't go past 10-bit color depth.
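
If you're curious where those color counts come from, the arithmetic is simple: each of the three RGB channels gets 2 raised to the bit depth levels, and the total is that number cubed. A quick sketch in Python:

```python
# Number of displayable colors for a given per-channel bit depth:
# each of the three RGB channels has 2**bits levels.
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

for bits in (8, 10, 12):
    print(f"{bits}-bit: {color_count(bits):,} colors")
# 8-bit:  16,777,216 colors     (~16.7 million)
# 10-bit: 1,073,741,824 colors  (~1.07 billion)
# 12-bit: 68,719,476,736 colors (~68.7 billion)
```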

Winner: Tie between Dolby Vision and HDR10+. Even though both HDR10+ and Dolby Vision can support content with bit depths above 10-bit, most content won't reach that, and streaming content is always capped at 10-bit color depth, so there's no practical difference between the two dynamic formats.

Peak Brightness

When it comes to watching HDR content, a high peak brightness is very important, as it makes highlights pop. HDR content is mastered at a certain brightness, and ideally the TV matches that brightness: if the content is mastered at 1,000 cd/m², you want the TV to display it at exactly 1,000 cd/m².

HDR10

  • Mastered anywhere from 400 to 4,000 cd/m²
  • Technical limit: 10,000 cd/m²

HDR10+

  • Mastered from 1,000 to 4,000 cd/m²
  • Technical limit: 10,000 cd/m²

Dolby Vision

  • Mastered from 1,000 to 4,000 cd/m²
  • Technical limit: 10,000 cd/m²

Dolby Vision and HDR10+ content are currently mastered between 1,000 and 4,000 cd/m², with most content at around 1,000 cd/m². HDR10 can be mastered anywhere up to 4,000 cd/m², depending on the content, but it doesn't mandate a minimum brightness. All three standards allow for images of up to 10,000 cd/m², although no display can currently reach that level. There's therefore no real difference between the dynamic formats, as they both top out at 4,000 cd/m².

Winner: Tie between HDR10+ and Dolby Vision. Both HDR10+ and Dolby Vision are currently mastered between 1,000 to 4,000 cd/m², so there's no difference there.

Metadata

Metadata can be thought of as an instruction manual that describes various aspects of the content. It's contained alongside the series or film and helps the display deal with the content in the most effective way.

HDR10

  • Static metadata
  • Same brightness and tone mapping for the entirety of the content

HDR10+

  • Dynamic metadata
  • Adjusts the brightness and tone mapping per scene

Dolby Vision

  • Dynamic metadata
  • Adjusts the brightness and tone mapping per scene

One of the main ways the three formats differ is in their use of metadata. HDR10 uses only static metadata: the brightness boundaries are set once for the entire movie or show, determined by the brightness range of the brightest scene. Dolby Vision and HDR10+ improve on this with dynamic metadata, which tells the TV how to apply tone mapping on a scene-by-scene or even frame-by-frame basis. This provides a better overall experience, as dark scenes won't appear too bright.
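
To make the distinction concrete, here's a loose sketch in Python of what each kind of metadata carries. The classes and fields are invented for illustration rather than taken from any real bitstream, though MaxCLL and MaxFALL are the actual static values HDR10 defines:

```python
from dataclasses import dataclass

@dataclass
class StaticMetadata:
    """HDR10: one record covers the entire movie or show."""
    max_cll: int   # Maximum Content Light Level, in cd/m²
    max_fall: int  # Maximum Frame-Average Light Level, in cd/m²

@dataclass
class SceneMetadata:
    """HDR10+/Dolby Vision: a fresh record per scene (or frame)."""
    scene_id: int
    peak_brightness: int  # brightest highlight in this scene, in cd/m²

# Static: the whole film is treated as if every scene could contain
# its single brightest highlight.
hdr10 = StaticMetadata(max_cll=4000, max_fall=400)

# Dynamic: a dark scene can declare its true range, so the TV doesn't
# tone map it as though a 4,000 cd/m² highlight might appear.
hdr10_plus = [
    SceneMetadata(scene_id=1, peak_brightness=4000),  # bright outdoor scene
    SceneMetadata(scene_id=2, peak_brightness=120),   # dim indoor scene
]
```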

However, some TV manufacturers ignore the metadata, and the TVs use their own tone-mapping to master content, in which case the HDR format's metadata doesn't matter, and the performance comes down to the TV.

Winner: Dolby Vision and HDR10+. Their dynamic metadata adapts better to scenes with very different lighting.

Tone Mapping

Tone mapping describes how a TV handles colors and brightness levels it can't display. In other words, if an HDR movie has a bright red in a scene, but the TV can't display that particular shade of red, what does it do to make up for it? There are two common ways for a TV to tone map. The first is called clipping: everything above a certain brightness is displayed at the TV's maximum, so you don't see any detail or distinct colors above that level.

The other common method is for the TV to remap the range of colors, meaning it displays the required bright colors without clipping. Even if it doesn't reproduce the exact shade of red, the image still looks good. There's a gentler roll-off as colors approach their peak luminance, so you don't lose detail, but the overall highlights are dimmer than on a TV that uses clipping.
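
As a rough illustration of the two approaches, here's a sketch of a hard clip versus a simple roll-off curve applied to luminance alone. Real TVs use far more sophisticated, proprietary curves, and the knee point here is an arbitrary assumption:

```python
def clip(nits: float, display_peak: float) -> float:
    """Hard clipping: everything brighter than the display's peak is
    flattened to the same value, so highlight detail is lost."""
    return min(nits, display_peak)

def roll_off(nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Gentle roll-off: pass the signal through unchanged up to a knee
    point, then compress everything above it into the remaining
    headroom, preserving detail at the cost of dimmer highlights."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    excess = nits - knee_nits
    headroom = display_peak - knee_nits
    return knee_nits + headroom * (1 - 1 / (1 + excess / headroom))

# A 4,000 cd/m² highlight on a 1,000 cd/m² display:
print(clip(4000, 1000))      # 1000 -- at peak, all detail above is gone
print(roll_off(4000, 1000))  # ~982 -- dimmer, but gradations survive
```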

Between the three HDR formats, the difference lies in how the metadata drives tone mapping. Dynamic formats like Dolby Vision and HDR10+ can tone map on a scene-by-scene basis, and sometimes the content is tone-mapped by the source device, which saves the TV processing power. As for HDR10, since it uses static metadata, the tone mapping is the same across the entire movie or show, so content doesn't look as good.

Winner: HDR10+ and Dolby Vision. Both use dynamic metadata to change the tone mapping on a scene-by-scene basis.

Backwards Compatibility

Both HDR10+ and Dolby Vision are backward-compatible with static HDR formats on Ultra HD Blu-rays, so if you're watching older HDR content, you won't have to worry about which format it's in; your new TV will be able to display it. The two formats use different technology to build upon older HDR formats, though. HDR10+ adds dynamic metadata to HDR10 content, so if an HDR10+ TV needs to display plain HDR10 content, it simply does so without the dynamic metadata. Dolby Vision is more complicated, because it can use any static HDR format as a 'base layer' and build from it. Because it builds on static metadata, TVs that don't support Dolby Vision can read the static metadata alone, which makes the format backward-compatible.

All Ultra HD Blu-ray discs must include HDR10 as a static metadata base layer. This means they're backward-compatible with any HDR TV: if a disc is in Dolby Vision and your TV only supports HDR10+, it'll play the movie in HDR10 instead. However, the same can't be said about streaming content, because a Dolby Vision movie on Netflix might not carry the HDR10 base layer, so if your TV doesn't support Dolby Vision, it will simply play in SDR instead.
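
The resulting fallback behavior can be summarized in a few lines of Python. This is illustrative only; real devices negotiate formats through HDMI capability exchange rather than anything like this lookup:

```python
def playback_format(content_formats: set[str], tv_formats: set[str]) -> str:
    """Pick the best HDR format both the content and the TV share,
    falling back to SDR when there's no common format at all."""
    for fmt in ("Dolby Vision", "HDR10+", "HDR10"):
        if fmt in content_formats and fmt in tv_formats:
            return fmt
    return "SDR"

# An Ultra HD Blu-ray always carries an HDR10 base layer, so a
# Dolby Vision disc still plays in HDR on an HDR10+-only TV:
print(playback_format({"Dolby Vision", "HDR10"}, {"HDR10+", "HDR10"}))  # HDR10

# A streamed Dolby Vision title without the HDR10 base layer
# falls all the way back to SDR on the same TV:
print(playback_format({"Dolby Vision"}, {"HDR10+", "HDR10"}))  # SDR
```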

TVs that don't support a specific format

If your TV supports Dolby Vision or HDR10+, but not both, you're limited in which HDR content you can watch in its intended format. If a TV doesn't support the format your Blu-ray uses, playback falls back to HDR10, so you can't watch the content as intended. For example, Samsung TVs don't support Dolby Vision, so any Dolby Vision Blu-ray is limited to HDR10 on them, and if you're streaming a Dolby Vision movie that doesn't have an HDR10 base layer, you'll get SDR. TVs that support both formats have an advantage, as you'll always see content in its proper dynamic format.

Availability

Supported Devices

The availability of the newer HDR formats has drastically improved in recent years. All HDR content is available in at least HDR10, and Dolby Vision is available on most streaming services. Although not as common, HDR10+ is growing in popularity on Blu-rays and certain streaming services like Amazon Prime Video. As of October 2022, Apple supports HDR10+ in the Apple TV+ app, and all of its HDR content has been updated with HDR10+ metadata. Find out where to find HDR content here.

Winner: HDR10 and Dolby Vision.

Supported TVs

While most TVs support HDR10 and many models support at least one of the more advanced formats, only a few brands like Vizio, Hisense, and TCL have support for both on their TVs. In the United States, Sony and LG support Dolby Vision, while Samsung TVs have HDR10+ support.

You shouldn't expect cheaper HDR TVs to make use of all the extra capabilities of these formats. On most of them, you won't even see a difference, as only high-end TVs can take full advantage of HDR and display it to its full capabilities.

Winner: HDR10.

Gaming

                  HDR10   HDR10+   Dolby Vision
PS4/PS4 Pro       Yes     No       No
PS5               Yes     No       No
Xbox One          Yes     No       Yes
Xbox Series X/S   Yes     No       Yes
Nintendo Switch   No      No       No
PC                Yes     Yes      Yes

Although HDR was initially developed for movies, its advantages for gaming are undeniable. On the console side, the Xbox One and Xbox Series X both support Dolby Vision. As with movies, game developers have to enable HDR support in their games. There are a handful of Dolby Vision games available for PC and consoles, including Borderlands 3, F1 2021, and Call of Duty: Black Ops Cold War, to name a few. HDR10+ Gaming is an expansion of HDR10+ focused on gaming; PC gamers can take advantage of it, especially with a Samsung display, but consoles are sticking with Dolby Vision support. Unfortunately, HDR isn't always implemented properly, so actual performance varies.

See our recommendations for the best 4k HDR gaming TVs.

Winner: Tie between Dolby Vision and HDR10+. Dolby Vision games are more widely available than HDR10+ games, especially when it comes to consoles, but HDR10+ is slowly making its way into the PC gaming world.

Monitors

The vast majority of monitors support HDR, but this doesn't mean they're good for HDR, as they're behind TVs in that regard. Most monitors only support HDR10, not HDR10+ or Dolby Vision, so you don't get dynamic metadata, and they usually have low contrast and low HDR peak brightness. If you want the best HDR experience possible, watch content on a TV.

Winner: HDR10.

HLG

Dolby Vision, HDR10+, and HDR10 aren't the only HDR formats. There's also HLG, or Hybrid Log-Gamma. All modern TVs support it, and it aims to simplify things by combining SDR and HDR into one signal. That makes it ideal for live broadcasts, as any device receiving the signal can play it: if the device supports HDR, it displays the HDR version; if it doesn't, the SDR portion of the signal is shown. As it's intended mainly for live broadcasts, there's very little HLG content available.
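
For the technically curious, the way HLG packs both signals into one is visible in its transfer function from ITU-R BT.2100: the lower half is a conventional gamma-like curve that SDR displays can show directly, while the logarithmic upper half carries the HDR highlights. A small sketch:

```python
import math

# HLG OETF from ITU-R BT.2100: maps normalized scene light E in [0, 1]
# to a signal value E' in [0, 1].
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """The square-root lower branch behaves like a standard gamma
    curve, so SDR displays show it correctly; the logarithmic upper
    branch packs HDR highlights into the top of the same signal."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(hlg_oetf(1 / 12))  # 0.5  -- the crossover between the two branches
print(hlg_oetf(1.0))     # ~1.0 -- peak scene light
```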

Conclusion

Between Dolby Vision and HDR10+, there's no clear winner from a technical standpoint, because both use dynamic metadata to improve the overall quality. HDR10+ almost matches the capabilities of Dolby Vision but lags in content availability, and fewer TVs support HDR10+ than Dolby Vision. HDR10 has the distinct advantage of having the most content available and being supported by every 4k TV.

Ultimately, the difference between the three formats isn't that important. The quality of the TV itself has a much bigger impact on HDR. All of these formats can produce far more dynamic images than what we used to see, and HDR delivers a more impactful movie experience as long as the TV displays it properly. HDR does have limitations, though: no TV can yet reach the 10,000 cd/m² peak brightness or display the full range of colors HDR is capable of, but most TVs still deliver a satisfying HDR experience.