Hey there 👋
We’re back with another spec dive. Screens are everywhere - TVs, laptops, phones - and the marketing noise around them can get overwhelming fast. Knowing what actually matters (and what’s just sticker hype) makes all the difference.
Now, we get it - articles like these aren’t clickbait or “hot takes.” But if you’re the kind of person who likes picking up something new each week, that’s exactly why we do them.
It’s also why we started the What The Tech series - for folks who want the quick news updates. Pieces like this one? They’re for when you want to slow down and actually understand the tech behind the buzzwords.
That being said, let’s get into it.
If you walk past a TV showroom these days, you’ll see that 4K UHD and HDR are the two things on top of everyone's mind. You just can't escape the yellow HDR sticker.
From disc covers to console boxes and streaming platforms, almost everyone is talking about HDR. And when something gets discussed by that many people, confusion naturally follows - HDR is no different.
That's why in this article we'll explore what HDR really is, who it's useful for, whether it's even worth it for the average consumer, and what the future outlook of this technology looks like.
What is High Dynamic Range (HDR)?
High Dynamic Range, aka HDR, is a technology that enhances the viewing experience on TVs by allowing a wider range of colors and improved contrast. The goal is a much more natural and realistic picture.
For an average person, the HDR experience means that you'll have more detail in the content you consume. You'll experience better colors with deeper shadows and brighter highlights in whatever content you're watching in HDR.
When compared to traditional Standard Dynamic Range (SDR), HDR offers colors that don't just pop but also look brighter and more detailed; they feel closer to reality. For example: in a midnight scene, the fireworks would be brighter and the shadows more pronounced.
HDR vs SDR
Traditionally, most of the content we consume was built for older displays that had a limit to how bright and dark they could get. They had a limited dynamic range - the span between the maximum brightness and darkness the display could achieve.
In addition to a lower dynamic range, older displays could also reproduce a limited range of colors, and they received no metadata about the content they displayed - no tailored information about what was being played.
HDR emerged as a challenge to Standard Dynamic Range (SDR) once display technologies started to get better. Newer panels simply don't share those old limitations.
Newer displays offer not only better contrast but better colors too, and this made way for HDR as the standard that incorporates improved contrast, a wider color range, and metadata about the specific content being consumed.
The Science Behind HDR
There are four main ways in which HDR builds upon Standard Dynamic Range (SDR):
Brightness
Color
Bit-Depth
Metadata
Brightness
HDR displays can achieve a significantly higher peak brightness than traditional displays that don't support HDR.
Older monitors hovered somewhere between 200 and 400 nits of peak brightness, but common HDR displays start from 500 nits and easily go beyond 2,000 nits - roughly 2 to 4 times (or more) the peak brightness of a non-HDR display.
This stark difference in peak brightness level allows HDR displays to produce a significantly more detailed image than the traditional lower peak brightness displays.
Color
A TV's color gamut refers to the range of colors it can display, and HDR displays have a significantly wider gamut than SDR displays.
A wider color range means that the colors that the display puts out would feel more real and natural to the eye simply because the selection of available colors gets larger.
The two most commonly used color spaces for HDR displays are DCI-P3 and Rec. 2020. Both are simply ranges of colors and are used to express how many colors a display can reproduce. DCI-P3 is the standard color space for digital cinema, while Rec. 2020 is the target color space for HDR TVs.
DCI-P3 is the smaller of the two, which means Rec. 2020 can represent more colors. But even comparing DCI-P3 coverage to SDR's Rec. 709, the difference is significant - and nearly all HDR monitors can easily cover Rec. 709.
Bit-Depth
Traditionally we use three colors to display images on a screen: Red, Green and Blue. At 8-bit depth, each of these channels gets a value from 0 to 255 - that's 256 possible values per channel.
This 8-bit depth is what SDR displays normally use, giving a limit of roughly 16 million available colors. Take that up to 10-bit depth and we get more than 1 billion available colors.
Newer HDR displays use a 10-bit depth, which means 2 to the power of 10 - that is, 1,024 possible values - for each of the three channels. The result is a significantly larger pool of possible shades to choose from on HDR displays.
HDR10 uses 10-bit depth, but other, more advanced HDR formats like Dolby Vision support 12-bit depth, a big step up again. Most HDR content, however, is made to utilize 10-bit depth.
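The bit-depth arithmetic above is easy to verify yourself - each channel gets 2^bits values, and the total palette is that number cubed:

```python
# Total colors available at a given per-channel bit depth:
# each of the three channels (R, G, B) has 2**bits distinct values,
# and the full palette is that count cubed.
def total_colors(bits_per_channel: int) -> int:
    values_per_channel = 2 ** bits_per_channel
    return values_per_channel ** 3

print(total_colors(8))   # 16,777,216 (~16 million, typical SDR)
print(total_colors(10))  # 1,073,741,824 (~1.07 billion, HDR10)
print(total_colors(12))  # 68,719,476,736 (~68.7 billion, Dolby Vision)
```

That jump from ~16 million to over a billion shades is why 10-bit HDR gradients look so much smoother - there are simply far more in-between values for the display to use.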
Metadata
Something SDR never had access to was metadata. SDR carries no content-specific metadata that the display could use to tailor its output - HDR does.
HDR content ships with additional signals alongside the video itself. This metadata includes instructions that tell the display specific information about the content being played.
Because of this metadata, HDR displays get the insight they need to reproduce HDR content properly, adjusting brightness levels and color on the display accordingly.
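To make the static-vs-dynamic metadata distinction concrete, here's a toy sketch. The field names are illustrative (loosely inspired by real HDR metadata like maximum content light level), not an actual file format:

```python
# Static metadata (HDR10 style): one set of values for the whole title.
static_metadata = {
    "max_content_light_nits": 1000,  # brightest pixel anywhere in the film
}

# Dynamic metadata (Dolby Vision / HDR10+ style): values per scene,
# so the display can re-tune itself as the content changes.
dynamic_metadata = [
    {"scene": 1, "max_light_nits": 150},   # dim interior scene
    {"scene": 2, "max_light_nits": 1800},  # fireworks scene
]

def target_brightness(scene: int) -> int:
    """Pick a tone-mapping target for a scene (toy logic, not a real spec)."""
    for entry in dynamic_metadata:
        if entry["scene"] == scene:
            return entry["max_light_nits"]
    # No per-scene info: fall back to the single whole-title value.
    return static_metadata["max_content_light_nits"]

print(target_brightness(1))  # 150
print(target_brightness(2))  # 1800
```

With static metadata the display has to pick one compromise setting for the entire movie; with dynamic metadata it can dim for the interior scene and open up for the fireworks.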
Discussing HDR Formats
HDR comes in a number of formats with some minor and some significant differences between them. Let's discuss each of the major HDR formats in detail:
HDR10
This is the most widely implemented HDR format. HDR10 is an open, royalty-free standard, which means display manufacturers don't have to pay for any license - and that's what makes HDR10 the most popular.
HDR10 uses 10-bit depth and is a static HDR implementation, which means its brightness and color metadata is set once for the whole title rather than changing per scene or per frame.
HDR10 is the baseline spec pushed by the UHD Alliance and the most widely supported HDR format. Chances are that if you have a 4K UHD TV, it already supports HDR10.
Dolby Vision
Dolby Vision is a proprietary HDR format developed by Dolby. It is a dynamic HDR format, carrying specific instructions for each scene (or even frame) of the content being played.
Since Dolby Vision requires manufacturers to pay royalties, it isn't as widely supported as HDR10. Dolby Vision supports 12-bit depth, which makes it more advanced than standard HDR10.
Although Dolby Vision supports a theoretical maximum of 10,000-nit displays and 12-bit depth, most of the content out there still uses 10-bit color.
HDR10+
As a response to Dolby Vision, Samsung, Panasonic and 20th Century Fox made a joint effort and created HDR10+, a royalty-free dynamic HDR format that improves on HDR10.
The biggest improvement of HDR10+ over the original HDR10 is that it is a dynamic format. When supported content is played, HDR10+ allows fine-tuned, scene-by-scene adjustments for color and contrast, just like Dolby Vision.
Platforms like YouTube, Paramount+ and Hulu support HDR10+ right now.
HLG
Developed by the BBC together with Japan's NHK, Hybrid Log-Gamma (HLG) is another HDR format, made specifically for broadcasters. Although not many broadcasters support HDR at the moment, it's good to know that almost all HDR displays support HLG, since it too is royalty-free.
What do you need to experience HDR?
To experience proper HDR you need to have three things:
A screen that supports HDR
A media player, a streaming device/console that supports HDR
HDR-supported content
Simply buying an expensive HDR TV won't be of much benefit if your Netflix plan doesn't support HDR or the disc you're watching isn't HDR compatible. That's why the content you watch, the player you use and the TV you have all need to support HDR.
For the display panel, aim for a peak brightness of at least 600 nits. The panel should also be at least 10-bit to support common formats like HDR10.
The media player or console you use must specifically support HDR too. High-bandwidth cables also help, as long as they are compatible with your devices.
Baseline HDR Requirements
Many believe that even a baseline HDR experience beats an SDR one. For the average viewer, here are the baseline specs to aim for:
HDR10
600 nits or above peak brightness
HDR-compatible content (4K UHD Discs, etc)
The specs above are sort of non-negotiable. Of course, if you want to go above and beyond, you can go for dynamic formats like Dolby Vision (which supports 12-bit color) or HDR10+, on displays with a peak brightness of 2,000 nits and above.
But the thing is, most HDR content is mastered for HDR10, which is the baseline spec anyway. And there isn't much utility in Dolby Vision's 12-bit support yet, since 12-bit HDR content isn't really out there.
That said, the dynamic metadata and scene-by-scene fine-tuning that formats like HDR10+ and Dolby Vision offer can be appealing for some use-cases. Their utility for an average user is still limited.
So, unless you really want to go real hard on your TV, meeting even the baseline for HDR would be a great uplift from the SDR experience.
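If it helps, the baseline checklist above can be boiled down to a tiny, purely illustrative function (it assumes displays carrying dynamic formats also handle plain HDR10, which is typical):

```python
def meets_baseline_hdr(hdr_format: str, peak_nits: int, content_is_hdr: bool) -> bool:
    """Check the article's baseline: HDR10 support, 600+ nits, HDR content.

    Illustrative only - real buying decisions involve more factors
    (panel type, local dimming zones, tone mapping quality, etc.).
    """
    # Dolby Vision / HDR10+ sets generally support HDR10 as well.
    supports_hdr10 = hdr_format in {"HDR10", "HDR10+", "Dolby Vision"}
    bright_enough = peak_nits >= 600
    return supports_hdr10 and bright_enough and content_is_hdr

print(meets_baseline_hdr("HDR10", 650, True))   # True: meets the baseline
print(meets_baseline_hdr("HDR10", 350, True))   # False: too dim for real HDR
print(meets_baseline_hdr("HDR10", 650, False))  # False: SDR content gains nothing
```

The point the function makes is the same as the article's: all three legs - format, brightness, and content - have to be there, or the HDR sticker doesn't buy you much.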
Content Availability
With the rise of 4K UHD TVs, HDR is getting common and a large number of TVs now support it. Most 4K UHD TVs already meet the HDR10 minimum threshold.
Beyond the display manufacturers, streaming giants like Netflix already offer plans with HDR support built in. Most, if not all, 4K UHD discs support HDR as well.
Newer consoles like the Xbox Series X and PlayStation 5 also support HDR, along with specific games that make use of HDR features. Higher-end smartphones have HDR support too.
If you’ve been scrolling through YouTube or Instagram Shorts/Reels lately on an HDR-capable phone, you might’ve noticed something strange - sudden spikes in brightness that almost feel like you’re being flashbanged.
That’s HDR sneaking into your daily apps. While the automatic brightness jump can be jarring, you can’t deny the clarity difference. Side by side, HDR clips look sharper, richer, and far more lifelike than standard ones, which is why platforms are leaning hard into pushing HDR video now.
Is HDR Worth It?
Generally, yes - if you can afford a good HDR TV and a compatible player - but the answer can change for some people once you get specific. Some even argue that HDR should be prioritized over screen resolution.
Even the base HDR10 is a big step up from the SDR viewing experience. HDR10's significantly larger palette of roughly 1 billion colors, up from SDR's 16 million, makes a real difference in the color accuracy of the content.
The wide range of available colors coupled with the peak brightness that starts from the base of 600 nits is also a big difference coming from standard screens that rarely go to 200 or 300 nits of peak brightness.
Here's my general take on HDR for the common use-cases:
Movies & Shows - absolutely yes if the set hits 600 nits+ and the source is 4K HDR.
Gaming - worth it on modern consoles/GPUs if the monitor or TV meets true HDR specs and can keep its latency low.
Work Laptops - questionable unless you’re a creator or the price bump is tiny; limited screen real estate hides the gains.
Phones - most mid-range and flagship phones support HDR anyway, so the cost premium is negligible, though good brightness control matters outdoors. And if you're a professional video creator, I don't need to advise you; you already have the answer :)
Pros & Cons of HDR
For people who spend a lot of time consuming content, HDR is a solid leg-up over SDR, but there are a few things to take note of before getting a new HDR display.
There are some fake HDR displays that accept an HDR signal but can't actually reproduce it properly. To spot them, use the minimum baseline specs I shared above - the most important being a peak brightness of 600 nits or more.
Many Windows users have pointed out challenges with the current implementation of HDR on Windows. Manual HDR switching is a pain for many people, and coupled with limited developer support for HDR, it makes for a not-so-pleasant experience on Windows right now.
On Windows, some people have also mentioned SDR desktop bleeding when switching out of HDR content and some have also complained about tone-mapping inconsistencies.
The Future of HDR
4K UHD TVs are the standard nowadays, and most if not all of them support HDR. HDR10 being open and royalty-free means manufacturers are actually incentivized to implement HDR in their displays.
With the race toward Micro-LED displays and 5,000-nit panels being prototyped by manufacturers, we will continue to see major advancements in the years to come. The premium of today is the baseline of tomorrow.
I also believe that, with time, operating systems like Windows will get better at supporting HDR content. New game publishers are supporting HDR too, so the future for HDR looks promising.
As many say: after the switch from Standard Definition to High Definition, HDR is going to be the next pivotal jump for those upgrading from SDR.