If you’re like me, you’ve probably gone to your local electronics store and ogled at how beautiful TVs have become. Many of them can produce absolutely astounding images, with extremely black blacks and colors that make your eyes pop. While you’re browsing, you’ll come across a number of specs you may not understand, mainly a couple of types of 4K HDR, and 8 bit and 10 bit color. So what do these all mean? Fear not, as we’ve put together this guide just for you.
What is 4K HDR?
HDR stands for High Dynamic Range. This is essentially the difference between the blackest blacks and whitest whites a screen can produce. OLED TVs are special in this regard, as they can physically turn off individual pixels, meaning that blacks on these TVs are truly black. Peak brightness, on the other hand, is measured in a unit called nits. Newer 4K HDR TVs can produce extremely bright images, capable of up to around 4,000 nits, much brighter than the 300-500 nits of standard dynamic range televisions.
Related: 4K vs 1080p
There are quite a few 4K HDR standards making their way across the industry, but as of today two major players have come out on top: HDR10 and Dolby Vision.
8 bit, 10 bit, 12 bit… What do these mean?
If you didn’t know, a bit is the smallest piece of data a computer works with: a single value that can be either a 1 or a 0. Strung together, bits can represent much larger values.
8 bit color
In TVs, each individual value represents a specific color in a color space. When we talk about 8 bit color, we are essentially saying that the TV can represent each color channel with values from 00000000 to 11111111, or 256 variations per channel. Since TVs mix red, green, and blue values, 256 variations of each means that the TV can reproduce 256x256x256 colors, or 16,777,216 colors in total. This is commonly known as True Color, and was used for a number of years as the standard for both TVs and monitors.
With the advent of 4K HDR, we can push a lot more light through these TVs than ever before. Because of this, it’s necessary to start representing more colors, as 256 values for each primary color is not going to reproduce nearly as lifelike an image as something like 10 or 12 bit.
10 bit color
10 bit color can represent values from 0000000000 to 1111111111 in each of the red, green, and blue channels, or 1,024 variations per channel. This works out to 1024x1024x1024 = 1,073,741,824 colors, 64 times as many as 8 bit and an absolutely huge increase. For this reason, many of the gradients in an image will look much smoother, as in the image above, and 10 bit images are quite noticeably better looking than their 8-bit counterparts.
12 bit color
12 bit color ranges from 000000000000 to 111111111111, giving this scale 4,096 versions of each primary color, or 4096x4096x4096 = 68,719,476,736 colors in total. While this is technically 64 times as many colors as even 10 bit, a TV would have to produce images bright enough for you to actually see the difference between the two.
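If you want to double-check the arithmetic behind these three sections, a few lines of Python will do it (a quick sketch of the math, nothing TV-specific):

```python
# Number of representable colors at each bit depth.
# Each of the three channels (red, green, blue) gets the same number of bits,
# so the total is the per-channel count cubed.
for bits in (8, 10, 12):
    per_channel = 2 ** bits        # e.g. 2^8 = 256 shades per channel
    total = per_channel ** 3       # all R x G x B combinations
    print(f"{bits}-bit: {per_channel:,} per channel, {total:,} colors total")
```

Running this prints 16,777,216 colors for 8 bit, 1,073,741,824 for 10 bit, and 68,719,476,736 for 12 bit; each step up is a 64x jump, because 4 times as many shades per channel gets cubed across the three channels.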
In addition, the jump from 8 to 10 bit color is quite a bit more noticeable than the jump from 10 to 12, though Dolby’s 4,000 nit standard will widen that difference.
HDR10 vs HDR1000 vs Dolby Vision… What’s the difference?
HDR10 is not technically as advanced as Dolby Vision, having slightly lower standards and minimum requirements than its competitor. However, HDR10 is an open standard, meaning TV manufacturers can use the technology without paying Dolby royalties.
HDR10 aims for 1,000 nits of brightness as a peak target, though the spec actually caps out at around 4,000. It reproduces 10-bit color, guaranteeing that you’ll be able to achieve over 1 billion colors per pixel. This is the most popular standard, and will likely ship on a wider range of lower-cost HDR TVs.
HDR1000 is a standard set up by Samsung to ensure a peak brightness of 1,000 nits. It is often mixed in with the term SUHD, which stands for “Smart Ultra High Definition”. Samsung says this standard also uses a technology called Ultra Black, which reduces glare from lights and the sun on your television, so it may be worth looking into if you have glare issues.
This standard is obviously only available on Samsung TVs, which usually sit in the mid to high end of the market. It holds essentially the same specification as HDR10, but Samsung threw in that anti-glare technology to separate itself from the pack. Note that HDR1000 is distinct from HDR10+, Samsung’s later extension of HDR10 that adds dynamic, scene-by-scene metadata.
While Dolby Vision technically launched before the HDR10 standard, it’s certainly not as popular. Since Dolby owns the standard, manufacturers have to pay the company to put out TVs that use it, a turn-off for many attempting to appeal to the general consumer market.
Dolby Vision uses 12 bit color, technically 64 times as many colors as 10 bit. Dolby Vision aims for 4,000 nits as a target and caps out at 10,000. 4,000 nits is an extremely high amount of light, and few TVs can actually produce it even today. However, TVs do need to hit this standard before users can discern the benefit of that wider color range. With pricing how it currently is, however, we’d be hard-pressed to go for Dolby over HDR10.
Other HDR profiles
There are a few other HDR profiles floating around. Perhaps the most popular is Hybrid Log Gamma, or HLG. The BBC and other broadcasters from the UK and Japan want to use the technology for standard broadcast TV. HLG offers HDR much like HDR10, but it allows broadcasters to transmit the HDR and SDR signals all at once. This makes it easier for broadcast TV to upgrade its programming for HLG-compatible televisions while still showing SDR content on normal TVs.
Advanced HDR is another technology meant mostly for broadcast television. According to The Verge, it works by upscaling SDR to HDR. There isn’t a ton of information about this one yet, so keep your eye out for more!
Do you need 10 bit or 12 bit HDR?
Currently, live television does not support 10 bit color. Getting a 10 bit HDR TV will not magically make your standard content 10 bit or 12 bit capable. Some services like Netflix do offer 10 bit streams, but you’ll have to pay a bit more to actually watch supported content. In fact, standard Blu-ray discs only use 8 bit color, so your growing collection isn’t going to magically look better on that new TV of yours, though the image may technically be brighter.
What should you look out for?
Many TV manufacturers market their TVs as 4K UHD, but this can be very confusing for a number of reasons. While the panel itself may in fact be 4K, that specification has nothing to do with the panel’s HDR capability. It’s quite possible that a 4K TV has no true HDR compatibility at all, and even if it does, you need to make sure the panel is rated to process the signal.
Here are a couple of things to specifically look out for:
Some manufacturers will label their televisions as HDR even if they only support 8-bit color. This is because there are two different specifications that can classify a TV as HDR compatible: contrast and color depth.
Contrast is the specification we’re looking at here: the difference between the blackest black a panel can produce and the whitest white. Technically, the more nits a panel produces, the whiter it can be, so oftentimes manufacturers will meet a certain contrast level and stick the ‘HDR’ sticker on their TV. While this isn’t going to look as good as something with a 10-bit panel, it will still look quite a bit better than your standard ‘non-HDR’ 8-bit TV.
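As a rough illustration of the idea, contrast ratio is simply peak white luminance divided by black-level luminance. The nit figures below are hypothetical, not measurements of any real panel:

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Contrast ratio: peak (white) luminance over black-level luminance."""
    return peak_nits / black_nits

# Hypothetical LCD panel: 1,000-nit peak with a 0.5-nit black level
print(contrast_ratio(1000, 0.5))   # 2000.0, i.e. a 2,000:1 ratio

# An OLED can switch pixels fully off, so its black level approaches zero
# and the ratio becomes effectively infinite -- which is why OLED blacks
# look truly black.
```

This is why raising peak brightness alone is enough to hit a given contrast number: double the nits at the same black level and the ratio doubles too.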
The Rec. 2020 color space
The Rec. 2020 color space is a range of color, defined in 2012 as a standard for 10 and 12 bit depths on 4K and 8K TVs. This range is supported by the television’s processor, and not necessarily the panel itself. Some manufacturers produce televisions with 10 or 12 bit panels that cannot actually process the Rec. 2020 color space, leading to an image that is not truly 10 bit. While the contrast may be bright enough to register as HDR and make the image look better, the set will still only process the colors of an older color space.
The Samsung KU6300 (above) is one of these TVs, so make sure you check the supported color spaces before making a purchasing decision.
We hope this guide helped you understand the differences between all the bits HDR has to offer. It’s a confusing subject for a lot of consumers, and it can often be difficult to decide which standard is right for you. If you’re just looking for a decent HDR TV, HDR10 is probably perfectly fine for your needs. If you have to have the absolute best there is, however, Dolby Vision is what you’re going to want to aim for. Just keep in mind this is emerging technology, and there still isn’t a lot of 10 or 12 bit HDR content out there yet.