Blog Categories

UHD 101: Demystifying 4K, UHD Blu Ray, wide color gamut, HDR, 4:4:4, 18Gbps and the rest!

This comprehensive blog article is intended to demystify some of the technical details behind UHD or Ultra High Definition.

UHD is the next generation of video formats, with many potential picture-enhancing features over our current HD format. As we’ll explain, 4K resolution is only part of the UHD concept. There are other important components, such as wider color gamuts (more colors), higher color bit depths (smoother color gradations) and brighter image highlights (HDR).

Here are the items covered in our introductory section. Read this if you want the high level details:

  • What UHD Sources are there?
  • What is the Ultra HD Blu Ray specification?
  • What capabilities does my display need to have to enjoy UHD content?
  • Does my AVR / Pre-Pro need to be HDMI2.0 / HDCP2.2?

Then we get more technical, and explain some of the aspects of UHD:

  • 4K resolution
  • Wide color gamut
  • Color bit depth
  • Color subsampling
  • HDMI versions
  • HDCP
  • HDMI data rates
  • HDMI cables
  • Display certification standards
  • HDR

What UHD Sources are there?

At this point there are a few hardware options:

  • UHD Blu Ray players, such as the new Samsung UBD-K8500. You can buy discs for around $30 from Amazon or Best Buy right now in the US!
  • Kaleidescape Strato – these players can download UHD content from the Kaleidescape store. The movies are created directly from master files received from the content creators and are not simple copies of UHD Blu Ray discs. In fact, there is more UHD content on the store than there are UHD Blu Rays! The Strato supports HDR10 (but not Dolby Vision), as well as the lossless audio codecs and Dolby Atmos.
  • Sony FMP-X10 player, which we believe is end-of-life and does not support HDR.

Kaleidescape Strato

Then there are the streaming services, which can be accessed from many hardware devices such as those built into your TV, Roku 4K player, Amazon FireTV2, Nvidia Shield, etc.

  • Vudu
  • Netflix
  • Amazon
  • Ultraflix
  • M-GO
  • YouTube
  • Sony’s forthcoming “Ultra” service

For a comprehensive list of sources and available content, see the “Master List of 4K“… thread over at

Note that most of the streaming services have compressed audio soundtracks, even the ones advertising Dolby Atmos like Vudu. Since high performance home theater is about video and audio, that only leaves two sources of note: UHD Blu Ray and Kaleidescape’s Store. For more read My Search for Higher-Quality Soundtracks in Streaming Movies.

What is the Ultra HD Blu Ray specification?

The Ultra HD Blu Ray spec is as follows (see the rest of this article for details on how to “decipher” these acronyms):

  • Up to 4K resolution
  • 4:2:0 color sub-sampling
  • Up to 10 bit color
  • Up to 60 frames per second
  • Support for wide color gamuts (REC.2020)
  • Support for HDR10 and Dolby Vision
  • No 3D support
  • HDCP2.2

Ultra HD Blu Ray

As many of these specifications are optional, a disc labeled Ultra HD Blu Ray will not necessarily have HDR, a wide color gamut or 10 bit color. Hopefully UHD Blu Ray players will include some kind of signal information menu to reveal what is actually on the disc.

Note also that many of the movies announced for release on UHD Blu Ray so far were actually not shot at 4K resolution! See this article for some insightful analysis.

What capabilities does my display need to enjoy UHD content?

  • HDMI2.0 / HDCP2.2. First and foremost you need a display capable of receiving an HDMI2.0 / HDCP2.2 signal. Unfortunately for consumers, some early TVs sold as 4K capable do not have the required HDMI2.0 / HDCP2.2 inputs!
  • 4K resolution. UHD is 4x the resolution of HDTV, typically 3840×2160 (2160p), although the Sony projectors have a slightly higher resolution, since they use imaging chips derived from Sony’s cinema projectors.

Whilst not strictly necessary for UHD your display may also have:

  • Wide color gamut (WCG). HDTV and Blu-Rays are mastered and distributed in the REC.709 color gamut. With UHD, content can be distributed in either the REC.709 or REC.2020 color gamuts. If REC.2020 is used, that does not necessarily mean the content has been created using such a wide color gamut. We expect most of the “wide color gamut” content for the next few years to actually have a color gamut similar to the commercial cinema P3 space. REC.2020 can, in this instance, be considered the container, rather than necessarily the actual color gamut of the encoded content. That said, your display must accept a REC.2020 signal to display WCG content. Apparently some displays released to date, such as the Sony VPL-VW1100ES, do not accept REC.2020 even though they are marketed as being able to reproduce a color gamut approaching P3. Confusing, we know, but that is the state of things with respect to 4K / UHD in 2016!
  • High dynamic range (HDR). This is the ability to display images with highlights that are much brighter than today’s. These highlights are used for things like reflections off cars. Content has to be specifically mastered for HDR, so not all UHD content will necessarily be HDR. There are also at least three competing standards for HDR, including HDR10 and Dolby Vision, and no broadcast standard yet. There’s also a new gamma curve (more correctly an electro-optical transfer function, or EOTF), SMPTE ST2084. Whilst the UHD Alliance has mandated brightness standards for TVs, there are no such standards for projectors. In addition, whilst there is a mastering standard for HDR (1,000 nits for HDR10), there is no standard or accepted practice for how to scale to the output levels that your display is capable of. Suffice to say it is early days for HDR!
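The ST2084 “PQ” EOTF mentioned above maps a 0–1 code value to an absolute luminance of up to 10,000 nits. Here is a sketch of the published curve (our own rendering of the spec’s constants, not production code):

```python
def pq_eotf(n):
    """SMPTE ST2084 (PQ) EOTF: map a normalized code value n in [0, 1]
    to absolute luminance in nits (cd/m^2), peaking at 10,000 nits."""
    m1 = 2610 / 16384        # constants as published in ST2084
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.0))   # 0 nits: code zero is absolute black
print(pq_eotf(1.0))   # 10000 nits: full code is the 10,000 nit ceiling
print(pq_eotf(0.5))   # roughly 92 nits: half the code range is still SDR-ish
```

Note how non-linear the curve is: half the code values cover only the first ~92 nits, leaving the other half for the HDR highlights.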

There is some confusion over the following with respect to UHD displays:

  • 18Gbps. This may be required if content makers start releasing movies with high frame rates. 4K @ 60 frames per second may require 18Gbps data rates if it is at bit depths of 10 or 12 bits. Otherwise 10.2Gbps is fine, even for 4K60 at 8 bit!
  • 12 bit color. This refers to the color bit-depth. Old Blu-Ray is 8 bit, new UHD Blu Ray can support 10 bit. It’s not clear what bit depth is being used by the streaming services, but it’s probably 8 bit.
  • 4:4:4.  This refers to color sub-sampling. Whilst 4:4:4 is used in content mastering, UHD is distributed via 4:2:0.
  • 3D. There is no UHD 3D at this time, at least via Blu Ray.

Does my AVR / Pre-Pro need to be HDMI2.0 / HDCP2.2?

If you want to use your AVR or Pre-Pro to switch UHD sources then it needs to have HDMI2.0 (2.0a for HDR)/ HDCP2.2.

If your AVR/Pre-Pro does not have this then the workaround is to run one HDMI cable from the source to the display and another from the source to the AVR.  Both the Samsung UHD and Kaleidescape Strato players provide this functionality.


Getting Technical

This section covers most of the background behind the UHD specifications. We’ve tried to make it comprehensive, but beware that this means quite a few technical details! Where relevant we’ve linked to places where you can do further reading.

If you find any technical inaccuracies or things that require further clarification please leave a comment. It’s likely there are a few!

4K Resolution

Resolution refers to the number of pixels displayed. Our current HD standard is 1920 pixels horizontal by 1080 vertical. The new UHD standard doubles each dimension to 3840 x 2160, which is four times as many pixels.
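The arithmetic is easy to check:

```python
hd = 1920 * 1080     # 2,073,600 pixels
uhd = 3840 * 2160    # 8,294,400 pixels

# Doubling both dimensions quadruples the pixel count
assert uhd == 4 * hd
print(uhd / hd)  # 4.0
```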

UHD resolution
Image credit: unknown.

Confusingly, commercial cinema 4K, as specified by the Digital Cinema Initiatives (DCI) and used by many still and video cameras, is 4096 x 2160, which is slightly wider than UHD (sometimes called Quad Full HD) at 3840 x 2160.

Wide Color Gamut (REC.709, P3, REC.2020)

Nearly all current consumer content, whether HDTV over cable or satellite, DVD or Blu Ray, is created and distributed in the REC.709 color gamut.

The issue with this is that it only represents a small portion of the visible spectrum of colors. The standard used in commercial cinema has more colors and is called DCI P3. There is also an even wider color gamut called REC.2020 which is the “wide color gamut” standard used in UHD.
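One rough way to compare gamut sizes is the area of each gamut’s triangle in the CIE 1931 xy chromaticity diagram. This is a simplification (xy area is not perceptually uniform), but it makes the ordering concrete; the primaries below are the published values for each standard:

```python
def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Red, green and blue primaries in CIE 1931 xy coordinates
rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
dci_p3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, ap3, a2020 = (triangle_area(*g) for g in (rec709, dci_p3, rec2020))
print(f"P3 is {ap3 / a709:.2f}x the area of REC.709")        # 1.36x
print(f"REC.2020 is {a2020 / a709:.2f}x the area of REC.709")  # 1.89x
```

So P3 sits roughly a third of the way between REC.709 and REC.2020, which is why “P3 inside a REC.2020 container” is a realistic near-term target.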

Note that just because content is labeled as UHD does not mean it has a wide color gamut. Much of the initial UHD content actually has the same REC.709 color gamut that we have today.

UHD Color Gamuts
Image credit: Spectracal

WCG content comes in a REC.2020 container. As we mentioned earlier not every display can accept REC.2020. If your display cannot accept REC.2020 then the source should “downconvert” to the REC.709 space. The new Samsung UHD Blu-Ray player does this.

There is some confusion over the P3 color space: whilst the wide color gamut content actually encoded onto the disc may have a color gamut close or equivalent to the P3 space, in UHD it is transmitted inside a REC.2020 container. There is no consumer P3 color space. The confusion arises because display manufacturers are now advertising a percentage of P3, and this is also a specification that has been codified in the Ultra HD Premium TV certification. We guess the assumption being made is that, despite the container being REC.2020, what we will see for the foreseeable future is content whose actual colors are equivalent to the P3 space.

Color Bit Depth (8 bit, 10 bit, 12 bit)

Bit depth describes the number of potential values that the encoding of color information in a video signal can have.

Historically, Blu Ray has been 8 bit, which means 256 possible values each for red, green and blue. UHD Blu Ray is 10 bit, giving 1024 values per channel. 12 bit color provides 4096 values per channel.

One important reason that we have moved to a 10 bit system for UHD Blu Ray is to reduce color banding. This is an image defect where bands of color are visible. It’s more important in the UHD world because of the expanded color space and hence the greater color variations.

8 bit vs 10 bit color
Image credit: unknown.

HDMI2.0 supports 8, 10 and 12 bit color in various formats, as covered in the following sections.

Sometimes you will see references to 24, 30 and 36 bit color. These relate to the total color bit depth across all three RGB channels: 24 bit = 3 × 8 bit, 30 bit = 3 × 10 bit, and 36 bit = 3 × 12 bit (red, green and blue).
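The per-channel and total figures line up like this:

```python
def levels(bits):
    """Number of distinct values a channel of the given bit depth can hold."""
    return 2 ** bits

for bits in (8, 10, 12):
    per_channel = levels(bits)
    total_bits = bits * 3            # red + green + blue
    total_colors = per_channel ** 3  # every combination of the three channels
    print(f"{bits} bit: {per_channel} levels/channel, "
          f"'{total_bits} bit color', {total_colors:,} colors")
# 8 bit: 256 levels/channel, '24 bit color', 16,777,216 colors
# 10 bit: 1024 levels/channel, '30 bit color', 1,073,741,824 colors
# 12 bit: 4096 levels/channel, '36 bit color', 68,719,476,736 colors
```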

Color Subsampling (4:2:0, 4:2:2, 4:4:4)

Consumer video is stored, transmitted, and processed in a color space called Y’CbCr. The three components stand for:

  • Y’ = Luma, representing the brightness of the pixel
  • Cb = Blue color difference
  • Cr = Red color difference

This standard was defined back at the start of the color TV era as a way of including color information in the broadcast signal while remaining compatible with existing black-and-white sets.

Image credit: unknown.

Encoding in the Y’CbCr color space allows the resolution of the color channels Cb and Cr to be reduced through color or chroma subsampling. This technique takes advantage of the fact that human vision is more sensitive to light differences than to color differences.
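As a sketch of the idea, here is the BT.709 form of the conversion for normalized (0–1) non-linear R’G’B’ values. Real video signals add quantization ranges and offsets that we omit here:

```python
def rgb_to_ycbcr_bt709(r, g, b):
    """Convert non-linear R'G'B' (each 0..1) to Y'CbCr using the
    BT.709 luma coefficients. Cb and Cr land in roughly -0.5..0.5."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b  # luma: weighted brightness
    cb = (b - y) / 1.8556                      # blue color difference
    cr = (r - y) / 1.5748                      # red color difference
    return y, cb, cr

# Pure white carries no color difference information
print(rgb_to_ycbcr_bt709(1.0, 1.0, 1.0))  # ~ (1.0, 0.0, 0.0)
```

Because Cb and Cr carry only the color differences, they are the channels that can be thinned out by subsampling without the eye noticing much.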

There are three main types of color subsampling used today. These are 4:4:4, 4:2:2 and 4:2:0.

Color subsampling
Image credit: unknown.

If 4:4:4 is a full bandwidth signal, then 4:2:2 occupies 2/3rds the space and 4:2:0 occupies 1/2 the space. Blu-ray and UHD Blu-ray both store the video signal in the 4:2:0 format. This essentially means that every pixel has a Y’ sample, while Cb and Cr are each stored only once per 2×2 block of pixels.
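The fractions above fall out of counting samples over a 2×2 block of pixels (our own back-of-envelope using the usual J:a:b interpretation):

```python
# Cb + Cr samples stored per 2x2 block of pixels, for each scheme
CHROMA_SAMPLES = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2}

def samples_per_pixel(scheme):
    """Average stored samples per pixel: 4 Y' per block, plus chroma."""
    return (4 + CHROMA_SAMPLES[scheme]) / 4

full = samples_per_pixel("4:4:4")           # 3.0 samples per pixel
print(samples_per_pixel("4:2:2") / full)    # 0.666... -> 2/3 of full
print(samples_per_pixel("4:2:0") / full)    # 0.5      -> 1/2 of full
```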

To get the video displayed on the TV or Projector it typically goes through the following conversions:

  • 4:2:0 to 4:2:2
  • 4:2:2 to 4:4:4
  • Y’CbCr to RGB

Historically the source upsamples to 4:2:2, which is sent over HDMI, and then the display upsamples to 4:4:4 and converts to RGB. The reason for this sequence is that HDMI v1.4 and previous iterations did not support 4:2:0. HDMI2.0 does support 4:2:0, though only at 50/60 frames per second (FPS). At 24 FPS and 10 bit, only 4:4:4 and RGB are supported.

Formats supported by HDMI v2.0. Bold text indicates new formats supported by 2.0 but not by 1.4. Source: HDMI Alliance

Note that there appears to be some confusion about exactly what is supported in the HDMI specification, even among manufacturers and industry participants. Some people we discussed this article with said 4:2:0 was supported at 24/25/30, others said that as of 2.0a 4:2:2 was supported at 10 bit. In the absence of a clear industry-wide understanding we will stick to the information published on the HDMI website.

There is no intrinsic benefit to the source upsampling to 4:4:4 or converting to RGB. With respect to UHD and HDMI it is actually beneficial if the conversion to 4:2:2, 4:4:4 and RGB is, as much as possible, left to the display, as this reduces the HDMI bandwidth requirements.

Further reading:

  • Choosing a color space, by Spears & Munsil (note that some of the information for HDMI is out of date, as it was written for 1.4)

HDMI Handshaking

Placeholder for future section on HDMI handshaking between source and display and how this impacts data rates, color bit depth, color gamut and color space…

HDMI Versions (1.4, 2.0 and beyond)

There have been essentially three types of “4K capable” HDMI chipsets on the market. These have been implemented in various TVs, projectors, processors, AVRs and switchers since 4K came onto the market in 2013.

  • HDMI 1.4 chipsets supported data rates up to 10.2Gbps. This means they could do 4K at up to 30 frames per second at the formats supported by the 1.4 standard (8 bit RGB or 4:4:4 and 12 bit 4:2:2). Whilst 1.4 is therefore theoretically 4K capable nearly all components that have it lack HDCP2.2, which is the copy protection scheme the industry has settled on for UHD.
  • We then saw HDMI 2.0 chipsets that supported the new formats in the 2.0 standard but were still limited to data rates of 10.2Gbps. These new chipsets have HDCP2.2 so they can display some but potentially not all UHD content.
  • Finally at the end of 2015 we started to see 18Gbps chipsets.

We’ve summarized this information together with the formats and bandwidth requirements in the table below:

HDMI formats, versions and bandwidth requirements
HDMI formats, versions and bandwidth requirements

This is important information, as we are seeing some sources, such as the Samsung UHD Blu-Ray player, that only support 4:4:4 or RGB at 10 bit, both of which require the whole video chain to be 18Gbps capable. They could easily have used 12 bit at 4:2:2, which would have enabled compatibility with 10.2Gbps chipsets.

HDCP (1.4, 2.2)

HDCP stands for High-Bandwidth Digital Content Protection. The version used in UHD is 2.2.

HDMI data rates (10.2Gbps, 18 Gbps)

Part of the deal with UHD is the potential requirement of using HDMI chipsets and HDMI cables that support “18Gbps” data rates.

The table below summarizes the data rates for the different frame rates and formats that are part of the HDMI2.0 specification.

HDMI data rates
Data from Extron Electronics Data Rate Calculator

As you can see it is possible that a 10.2Gbps chipset and cable infrastructure can support UHD Blu Ray, assuming the transfer medium is 4:2:2 at 12 bit.
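For the curious, the quoted data rates can be reproduced from the pixel clock. Below is a sketch using the standard CTA-861 4K timings (which include blanking intervals, e.g. 4400 × 2250 total at 60Hz) and HDMI’s 8b/10b TMDS coding. One useful wrinkle: HDMI carries 4:2:2 in a 24 bit/pixel container regardless of 8/10/12 bit depth, which is why 12 bit 4:2:2 costs no more bandwidth than 8 bit 4:4:4:

```python
def tmds_gbps(h_total, v_total, fps, bits_per_pixel):
    """HDMI TMDS data rate in Gbit/s: pixel clock times bits per pixel,
    with 8b/10b line coding adding 25% overhead."""
    pixel_clock = h_total * v_total * fps  # Hz, including blanking
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9

# 4K60, 24 bits/pixel (8 bit 4:4:4, or 4:2:2 at any depth up to 12 bit)
print(tmds_gbps(4400, 2250, 60, 24))  # 17.82 -> needs an "18Gbps" chain

# 4K24 uses a wider 5500 x 2250 total but far fewer frames
print(tmds_gbps(5500, 2250, 24, 24))  # 8.91 -> fits in 10.2Gbps
```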

HDMI Cables

For short runs (say up to 6′ / 2m) most passive cables will be able to support the data rates required for UHD. Between 6′ and 15′ you’ll find some passive cables that can support UHD and others that can’t. Above 15′ you’ll very likely need an active cable.

HDMI Premium Certification

There are a few independent companies that are providing HDMI cable certification for UHD data rates. These include:

  • HDMI, “Premium Certified Cable”, at the 18Gbps rate
  • UL Lab, “High Speed 4K Cable”, at the 10.2 and 18 Gbps rates
  • THX
  • DPL Labs, “DPL Seal of Approval”, at the 10.2 and 18 Gbps rates

HDBaseT (HDMI over category cable) is capable of data rates up to 10.2Gbps.


Display Certification Standards

Ultra HD Premium

The main display certification standard is one created by the UHD Alliance. This is an industry consortium of content creators (e.g. Hollywood Studios), distributors (e.g. Amazon, DirecTV) and hardware manufacturers (e.g. LG, TCL). They have created a set of specifications and a logo to help consumers.

Ultra HD Premium

It’s not clear at present if this “badge” will just get applied to consumer displays, or if it’s intended to be used across the content creation and distribution ecosystem as well.

To get an Ultra HD Premium “sticker” a display must meet the following criteria:

  • Image Resolution: 3840×2160
  • Color Bit Depth: 10-bit signal
  • Color Palette (Wide Color Gamut)
    • Signal Input: BT.2020 color representation
    • Display Reproduction: More than 90% of P3 colors
  • High Dynamic Range
    • SMPTE ST2084 EOTF
    • A combination of peak brightness and black level either:
      • More than 1000 nits peak brightness and less than 0.05 nits black level
      • More than 540 nits peak brightness and less than 0.0005 nits black level (note the interesting “fudge” here, clearly something included in the spec for low light output OLED TVs…)
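Dividing peak brightness by black level shows just how different the two paths are, and why the second suits displays with near-perfect blacks:

```python
# Contrast ratio = peak brightness / black level (both in nits)
lcd_path = 1000 / 0.05    # bright panels with higher black levels
oled_path = 540 / 0.0005  # dimmer panels with near-perfect blacks

print(f"{lcd_path:,.0f}:1 vs {oled_path:,.0f}:1")  # 20,000:1 vs 1,080,000:1
```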



Interestingly, Sony, despite being a member of the UHD Alliance, has a different “sticker” that it is putting on its displays. We think they are doing this because they also have projectors, and to have a consistent “sticker” they need their own standard, since the projectors can’t hit the peak brightness / black level standards required for the UHD Premium “sticker”.

Sony 4K HDR Logo

High Dynamic Range (HDR) Standards

There are multiple HDR standards at this point, and it is not clear which one will become dominant in the market. HDR10 and Dolby Vision appear to be the front runners, but there are others lurking in the wings such as Hybrid Log Gamma (HLG).

It’s quite the “evolving ecosystem” at this point. Even if you buy a display with both HDR10 and Dolby Vision (which limits you to flat panel TVs), the amount of light output the TV can put out will likely be on an upward trajectory for the next few years. The TVs you can buy now are limited to about 1,000 nits, but the Dolby Vision standard can see a future with 10,000 nit displays!

This section covers the two main standards at a high level. We’d also recommend reading the State of HDR article if you really want the technical details (warning, it’s even more technical than what we’ve written).

Be aware that there are a number of articles out there about HDR, many of which were written in 2015 before the slightly clearer picture that emerged late in 2015 after the publication of the SMPTE HDR Imaging Ecosystem report. As such you’ll likely find conflicting and incomplete information. It’s likely that some of the things we have written are incorrect or incomplete too, so please leave a comment if you find things we should update or clarify!




HDR10

HDR10 is an open standard.

  • There appear to be fewer standards around production and playback than with Dolby Vision. For example, there is no dynamic metadata mapping high brightness areas of the image to the display’s capabilities, as there is in the Dolby Vision standard; it is left to each TV manufacturer to work out how to do that.
  • Content typically mastered to 1,000 nits and 10 bit color.
  • Requires HDMI2.0a.
  • It is mandatory that UHD HDR Blu-Rays include HDR10 metadata, even if they also have Dolby Vision. This requirement does not extend to other sources of HDR content, such as VUDU, which we don’t believe has HDR10 metadata.
  • Many 2016 TVs and some projectors support HDR10 – see this HDR-Capable Display list for a comprehensive and updated list.

Dolby Vision

Dolby Vision is one of the competing “standards” for HDR. More details can be found on the Dolby Vision page and in the Dolby Vision whitepaper.

Dolby Vision Logo

  • Specifies that the goal is for the cinematic master to be done in 12 bit color at up to 10,000 nits brightness. Note that no current displays can do 10,000 nits, so presently content is being mastered to 4,000 nits, primarily using the Dolby PRM-4220 monitor. It also seems that 12 bit is not a necessity for Dolby Vision, and 10 bit may be recommended for broadcast.
  • The content creator can specify in metadata how to display the specular highlights. For example they can create a 4,000 nit master and then also create “variants” for displays that can only do 1,000 nits. Apparently this is an improvement over HDR10, which leaves it up to the TV to figure out how to map high brightness content to the display’s capabilities.
  • Requires dedicated silicon, and therefore if a display does not have it on launch it is not going to get it via a firmware update.
  • Does not require HDMI2.0a. The Vizio Reference TVs do not have 2.0a.
  • The streaming service VUDU UHD is using Dolby Vision, as are Netflix for some of their original content shows (Marco Polo).
  • It does not seem that any UHD Blu-Ray players have Dolby Vision at this time, and therefore there are also no Dolby Vision Blu-Ray discs…please correct us if we are wrong.
  • Can be found in TVs from LG (ALL 2016 OLEDs – B, C, E and G Series), Vizio (Reference Series) and TCL
  • No consumer projectors have Dolby Vision.
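To illustrate the mapping problem both systems wrestle with, here is a toy highlight rolloff. This is purely our own illustration (neither HDR10 nor Dolby Vision specifies this curve) of how a 4,000 nit master might be squeezed into a 1,000 nit display:

```python
import math

def tone_map(nits, display_peak=1000.0, knee_frac=0.75):
    """Toy tone mapper: pass values through unchanged up to a knee, then
    roll the rest off asymptotically toward the display's peak brightness."""
    knee = knee_frac * display_peak
    if nits <= knee:
        return nits
    headroom = display_peak - knee
    return knee + headroom * (1 - math.exp(-(nits - knee) / headroom))

print(tone_map(500.0))   # 500.0 -> midtones untouched
print(tone_map(4000.0))  # ~999.99 -> the highlight is compressed under 1,000
```

Dolby Vision’s dynamic metadata lets the colorist control this mapping per title (or per scene); with HDR10 each TV maker invents its own version of a curve like this.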

Vizio Dolby Vision



51 thoughts on “UHD 101: Demystifying 4K, UHD Blu Ray, wide color gamut, HDR, 4:4:4, 18Gbps and the rest!”

  1. I want to thank you for that primer. It is very informative. I do have some questions though,
    I have a 2015 Samsung UN55JS8500. I know it doesn’t meet the newest standards but I think it meets some and is close on others. For example, it reaches 90% of DCI P3 and just over 68% of the Rec 2020 standard. It can also do HDR10, so I think it can do some form of HDR playback. I also think it’s a 10 bit display, and that it can display WCG.

    Plus I plan on having it calibrated. My friend knows someone who has calibrated his projector along with many other TVs and projectors. He uses a Sencore OTC1000 with the standard software for it. I asked him what standard he was going to use to calibrate my TV. This is what he said: “I will be calibrating your display to a reference white balance of D65 and REC709 color gamut against the CIE1931 color space and normalize the gamma curve to 2.2.” The software he will be using looks pretty extensive. It can do RGB stuff, check for contrast ratio along with a bunch of other stuff. I think he is using the ColorPro 6000 software. It looks pretty thorough to me. Now my question is this: if he calibrates to the Rec 709 standard, what happens if I get a 4k player for my TV? That color palette includes the 709 plus a lot more. What happens to the colors outside the 709 limit? Will they just not be calibrated? Will some colors be calibrated and others not?
    Thanks for any reply.

    1. Thanks for posting!

      4K does not necessarily have a wider color gamut than 1080p. Much of the current “1st generation” UHD is actually REC.709.

      To calibrate for wider color gamut you need the software and hardware to do it. I believe we will need to do two calibrations going forward for this, one for REC.709 and one for REC.2020 (the wider color gamut container that UHD can come in).

      1. Thanks for the reply. So for a while at least, I should be good, even if and when I have the ability to play back 4k content? So at some point in the future, once more content is available or that container is more prevalent, I will need to have a calibration for Rec 2020? Will that overwrite the 709 calibration, or will it keep that calibration and add the 2020 on top of it? Sort of like the color triangle: the 709 triangle will stay as it is, but the new calibration will just affect colors outside that triangle. And since you indicate that most of the early UHD content will be 709, I should be fully calibrated if I get a 4k player and disks?

        1. I think the answer to that question is “it depends” on how the display manufacturer implements calibrations.

          If it’s anything like today I think the likelihood is that you will need two separate calibrations. I’m not sure how display manufacturers are implementing it but I suspect there are two color gamut “options” (REC.709 and REC.2020). The display will likely automatically swap between the two depending on what UHD content it is being fed. I suspect that each color gamut will have its own color management system controls.

          Based on this reasoning your display will not be calibrated if you get UHD content with HDR or wide color gamut content.

          1. So, just as an example, are you saying that if I get a UBD player, connect it to my TV and watch a 4k movie… Like Mad Max Fury road, my TV won’t act like it’s calibrated at all? It will just default to whatever the standard settings are for the TV? To further that, if I calibrated my TV in “movie” mode, if I watch a BR I will get the calibrated version of “movie” mode but if I watch a 4k UBD, I will get the standard “movie” mode settings? Is that what you are saying?

            I mean, if I am honest, I watched some 4k clips from the YouTube app on my TV to make sure it was really 4k and as is, it looked REALLY good. So I think I can be happy with that. But thanks for letting me know. That is some good info that I did not know.

          2. I’d suspect that the answer to your question will vary from manufacturer to manufacturer.

            I’d also suspect, not having the hardware you have in front of me, that if you feed the TV a WCG/HDR signal that it’s going to automatically switch some of the picture settings. It will likely switch color gamut and gamma…but might also switch to a different “profile” completely, with different brightness, contrast, greyscale settings.

            You would be able to see for yourself what happens, by going into the picture menu for different sources and seeing if anything changes.

  2. Quad HD is not UHD (you could call it Quad Full HD, but no one does that). QHD is actually 4x720p (2x horizontal + 2x vertical resolution) = 1440p.

  3. This is a very good article.

    I calibrate audio and video on a regular basis and have worked with many of the new displays. Each color white point, gamut and EOTF you plan to use requires a separate calibration. Many of these new displays can support DCI P3, BT2020 and Rec. 709 color spaces. This should be triggered automatically with HDMI 2.0a. Display EOTF can also be toggled between BT1886 and ST2084.

    Displays that toggle automatically tend to have two different memories similar to what happens with 3D. One is for BT1886 and the other for ST2084. Within each of these they will use D65 for the white balance and any of the three color spaces. It appears to me that sources today are mostly supporting Rec 709 with a BT1886 EOTF and D65 white point or BT2020 with a D65 white point and ST2084 EOTF.

    I personally use a video generator along with an UHD Blu-Ray player to generate the necessary test patterns to be able to calibrate a memory for HDR and another for Rec. 709. The video generator enables much faster calibration for both Rec. 709 and HDR. The UHD Blu-Ray player allows one to proof the calibration with actual images.

    1. Thanks for reading Jeff and leaving a great comment. Indeed display calibration is more complex now with UHD sources, if you want 3D then you pretty much have to do three separate calibrations (HD, HD 3D, UHD). We’re using a Murideo SIX-G for generating the required patterns, what are you using?

  4. Very nice article, thank-you! Particularly for the “HDMI formats, versions and bandwidth requirements” and “HDMI data rates” table. The fact about the Samsung UHD player putting a burden on the cable you require, instead of using 12-bit 4:2:2, was a real eye-opener. I expect they were worried about displays not supporting 12-bit 4:2:2 (even if only 10 bits were used)?

    Please would you consider updating the article to include Dynamic HDR10, which will surely level the playing field more between HDR10 and DV.

  5. There seems to be some conflicting info on what resolutions, refresh rates, bit depth, and chroma subsampling methods are supported by HDMI 2.0. One chart says 4k/60 10bit 4:2:2 is not supported by the HDMI spec and another chart says it’s supported by 1.4 and could be used. I can run 4k/60 10bit 4:2:2 from my PC to my TV which according to one chart shouldn’t be possible. The GPU is using HDMI 2.0 ports.

    1. Hi Eric,

      I agree. It would be great if the HDMI specification were open source and then we could go straight to the “horse’s mouth”. Unfortunately it’s not, and so we are reliant on third party information 🙁


        1. I can, on Nvidia cards from 2014 on (Maxwell and Pascal, aka 980 and 1080, etc.) via HDMI 2.0a. I can get RGB 8-bit, 4:4:4 8-bit, and 4:2:2 8, 10, or 12 bit at 4096 x 2160 via the Nvidia control panel. I have the Samsung HU8550 from 2014, but it is a 10-bit panel natively. I think it accepts 12 bit but is really 10-bit plus dithering to make it 12 bit, but I’m no expert.

  6. Your “further reading” links don’t work. I was curious to read those articles, as HLG was implemented in HDMI 2.0b a few days ago.

    1. Which links are broken? I can see if I can fix them. That’s the problem with linking out to the web…stuff moves around!

  7. Great Info.

    If you wouldn’t mind, a bit confused on one aspect. So, if we are at 16 bit color sampling, is that 65536 (red) × 65536 (green) × 65536 (blue) = 281,474,976,710,656 different possible colors per pixel?

    1. Normally when we are discussing “bit depth” we are quoting values per channel, so for an RGB representation your math is right. In consumer video, though, the signal is Y’CbCr rather than RGB, so the bit depth applies to the luma (Y’) and the two color difference channels rather than to red, green and blue directly.

  8. Hi, thank you for all the info, it is very helpful, but I am still a little confused. My Onkyo AVR supports 4:4:4 and 4K@60, but I am not sure if that automatically means it supports HDR pass-through. Can someone clarify please? My receiver is an Onkyo HT-R494. I am not sure if it can pass HDR video to my TV if I plug in, let’s say, an Nvidia Shield TV. Thank you in advance.
    here is the manual :

  9. HI Nice explanation.
    I have an LG Oled B6 being fed with a nVidia GTX960 and I have enabled “Deep Color” on the TV.
    If I look at the control panel for the nVidia card, I see I can set a variety of options. For example I set 4:4:4 but I could of set RGB for example. I can also set 4:2:0 and then different bit depths.
    Now what happens when you are watching a 4K video? Are the appropriate settings made by the source? In other words, if I set 4:4:4 (which is 8 bit) and a 10 bit HDR video is being played, does it change automatically to accommodate the source?
    I haven’t seen any explanation of this.
    Thanks, Tom

    1. Do you need to enable deep color on the TV to get HDR?

      UHD content is 4:2:0 at 10 bits. If you have any higher resolution settings such as 4:4:4 set, or RGB in your display configuration then typical process is for the source to do the conversion from the content’s native color sampling rates.

      I am not sure if your computer changes resolution settings when you are viewing UHD content? Maybe your TV or computer has a way to see what is being sent across the interface?

  10. I have a Panasonic UB900 set to 12 bit 4:4:4. Is it better to set it to 10 bit because of colour banding? I have a 4K Sony XBR TV. Just wondering, as the TV has a 10 bit processor.

    1. I suspect 4:2:2 at 10 bit would be the best option if that is available, but you might also experiment with 4:4:4 at 10 bit if that’s an option. I’m not familiar with the Panasonic but there’s no point upscaling to 12 bit (if that’s what it does, rather than just bit padding) and then throwing away that information on the TV end.

    1. Sounds like a HDMI issue, maybe try a different cable. If it’s a long cable (>6ft) make sure it’s 18Gbps rated.

  11. Hate to break it to you, Tom, but your GeForce card cannot output 10-bit except in games. Everything else is 8-bit. You need a Quadro card to do 10-bit content creation, and then it’s only really supported in Adobe. You must have Quadro drivers, enable 10bpc in the control panel, and enable 30-bit display in Adobe. And Nvidia will never change that. Your cards are for games, and they intend to keep it that way and make you buy the more expensive Quadro to see 10-bit colors.

    So buying an HDR monitor for your PC is pointless unless you have a Quadro card. And I bet there will be zero HDR players for the PC, because the card makers refuse to let you display 10-bit on the desktop unless you are running their pro cards. And that also means you are not going to see HDR from online services. Your browser can’t support it anyway.

    1. That’s an interesting point.
      As I understand it, buying an HDR monitor for your PC primarily involves establishing whether it can do 4:4:4 chroma at 4K 60Hz; a proper 4K HDR display will do this in 8 bit.
      10-bit 4:2:2 is, by all accounts, also pretty crisp and readable.
      Then it’s a case of checking input lag.
      HDR content would be handled by a separate UHD Blu-ray player, as I don’t think a PC drive is yet available?
      Therefore the Quadro card issue would only affect streamed HDR content? All other PC use, including SDR streamed content, would be unaffected by this, and a 10-series card would be fine? That’s my understanding of it, and I’m happy to learn more, as I’ve just bought a 1080 to power such a screen, as yet unbought!

      1. Your 10-bit monitor is probably a scam 8-bit+FRC panel. True 10-bit costs money. There are no video players that will let you watch 10-bit HDR on your desktop, and NVIDIA is keeping it that way, which is why I said there will probably be no HDR players for the PC. They assume you know this and are a professional or business that can afford a professional card to edit 10-bit.

        All the players I have used for H.265 were junk except MPC with madVR. All the HDR videos I have found were 4:2:0.

        And then there are the “standards” for HDR. New panels hit 1000 nits, but there are 2000- and 4000-nit panels that are not for the public, and 12,000 nits for Dolby. Then there’s the DCI-P3 issue: Dell and Acer cover 97.7% and 95%. You can always buy a Dolby 4220 12-bit for $40,000.
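        For the curious, the “8-bit+FRC” trick mentioned above (frame rate control) approximates a 10-bit level on an 8-bit panel by temporally dithering between two adjacent 8-bit levels so their time-average lands on the 10-bit target. Here is a minimal sketch of the idea in plain Python; the function name is mine, and real panels dither spatially as well as temporally, so treat this purely as an illustration.

        ```python
        def frc_frames(value10, num_frames=4):
            """Approximate a 10-bit level on an 8-bit panel via temporal dithering.

            Each 10-bit step is 1/4 of an 8-bit step, so over `num_frames` frames
            we alternate between two adjacent 8-bit levels such that their
            average approximates the 10-bit target.
            """
            base, frac = divmod(value10, 4)      # nearest-below 8-bit level + remainder 0..3
            high_count = frac * num_frames // 4  # frames shown one 8-bit level up
            return [min(base + 1, 255)] * high_count + [base] * (num_frames - high_count)

        # 10-bit level 514 sits between 8-bit levels 128 and 129:
        frames = frc_frames(514)
        print(frames, "average:", sum(frames) / len(frames))
        ```

        Averaged over the four frames, the panel shows 128.5, which is exactly 514 on the 10-bit scale.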

  12. Great article overall, just a few corrections:
    1. Dolby Vision can be added in a firmware update:
    2. Dolby Vision is available on upcoming UHD 4K Blu-ray releases, but regardless, all UHD 4K Blu-rays must have HDR10 even if they also carry Dolby Vision:
    3. Multiple UHD 4K Blu-ray players are being released with the capability of playing back Dolby Vision discs, and other current players are releasing firmware updates to add this feature:
    4. All VIZIO TVs from the M and P series up to the Reference series, dating back to early 2016, have Dolby Vision capabilities, as VIZIO was the first to launch the technology on a home television in their Reference series in 2015:

    1. Thanks Robert! It’s on my list to update this article, since it’s over a year old, particularly the section on HDR: things are a lot clearer now than they were back then. Now I just need to find the time!

      1. Would absolutely love to see an update to this fantastic and foundational reference article. Adding the Apple TV and its delivery of 4K HDR10 / Dolby Vision to this discussion would be exquisite.

        1. It’s definitely on the list to update this article. The situation on HDR is much clearer than it was 1 year ago, for example. I was actually planning to pull the section on UHD sources, which would include any reference to the Apple TV, as there are plenty of other great places online where you can find that information. That would leave this article more of a technical one.

  13. Hi Nyal,

    Despite the lack of updates over the past year, this is still a great resource! Thank you for putting it together.

    I am slightly confused about the ideal Color Space and Color Depth output settings for an Oppo 203, for example:

    In the article, there is a bit that reads, “There is no intrinsic benefit to the source upsampling to 4:4:4 or converting to RGB. With respect to UHD and HDMI it is actually beneficial if the conversion to 4:2:2, 4:4:4 and RGB is, as much as possible, left to the display, as this reduces the HDMI bandwidth requirements.”

    However, in one of the comments above, you mention that “I suspect 4:2:2 at 10 bit would be the best option if that is available, but you might also experiment with 4:4:4 at 10 bit if that’s an option.”

    So, which combo of Color Space and Color Depth output settings from the Oppo 203 is best from a technical standpoint if I’m using it with an LG C7? The Oppo does have the 4:2:0 output option as well, which I am sure you’ve encountered by now. Thanks for any insight you can provide.

    1. 4:2:0 would be fine as an output from the Oppo if it has that option – I was not aware that was supported in the HDMI specifications. There is no color information beyond 4:2:0 on Blu-ray or UHD Blu-ray, so anything more is “made up” by the display upon conversion to RGB. I’m unsure whether it makes any difference to quality where the “chroma upsampling” (as it is called) occurs – in the display or in the source.
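      To make the chroma-upsampling idea concrete, here is a minimal sketch in plain Python (the function name is mine) of the simplest possible method, nearest-neighbor: 4:2:0 stores one chroma sample per 2×2 block of pixels, so upsampling to 4:4:4 just replicates each stored sample across its block. Real players and displays use bilinear or fancier interpolation filters, so this illustrates the data layout rather than production-quality scaling.

      ```python
      def upsample_chroma_420(chroma):
          """Nearest-neighbor 4:2:0 -> 4:4:4 chroma upsampling.

          `chroma` is an (H/2 x W/2) chroma plane as a list of lists;
          each sample is replicated over a 2x2 block of pixels.
          """
          full = []
          for row in chroma:
              wide = [sample for sample in row for _ in range(2)]  # double the width
              full.append(wide)
              full.append(list(wide))                              # double the height
          return full

      # A 2x2 chroma plane becomes a 4x4 plane: each value covers a 2x2 block.
      print(upsample_chroma_420([[10, 20], [30, 40]]))
      ```

      Whether this replication (or interpolation) happens in the source or in the display, the same information is being reconstructed; the difference is only in the quality of the interpolation filter used.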

  14. Great article! I finally have some answers to the question I had: “Is 18 Gbps enough to carry 4K 60fps 4:4:4 with a 10-bit color depth?”

    Unfortunately, if I read you correctly, the answer is no. I must say it is somewhat disappointing. I would have thought 2017 TVs would be able to display these specifications, as they are the ones required when used with a computer. It seems the screen is capable but the cable technology is lacking (I assume DisplayPort can achieve what I described). I bought a high-end Sony, and they could have included a DisplayPort connection, but I assume they were too lazy to do that. It feels like no manufacturer went the extra mile, even if you buy a $20,000 77” LG W8…

    I found it very difficult (except with your article) to discover that TVs have this limitation. No wonder they are not advertising it loudly. I was surprised the Apple TV 4K would only go up to 4:2:2, but now I know why! Thanks again.
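    The arithmetic behind that 18 Gbps question can be sketched quickly. This back-of-the-envelope check (names and structure are mine) assumes the standard 3840×2160@60 timing with a 594 MHz pixel clock including blanking, HDMI 2.0’s 8b/10b TMDS coding overhead, and a simple averaged bits-per-pixel model for subsampled formats; real HDMI 4:2:2 uses a fixed-width container, but the simplified model gives the right fits/doesn’t-fit answers for these cases.

    ```python
    # Rough HDMI 2.0 bandwidth check for 3840x2160 @ 60 Hz.
    # Assumptions: 594 MHz total pixel clock (incl. blanking),
    # 8b/10b TMDS coding (x1.25 overhead), 18 Gbps link limit.
    PIXEL_CLOCK_HZ = 594e6
    TMDS_OVERHEAD = 10 / 8
    LINK_LIMIT_BPS = 18e9

    # Average samples per pixel for each chroma format.
    SAMPLES_PER_PIXEL = {"RGB": 3.0, "4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

    def hdmi_rate_bps(subsampling, bit_depth):
        """Approximate on-the-wire bit rate for a given format at 4K60."""
        bits_per_pixel = SAMPLES_PER_PIXEL[subsampling] * bit_depth
        return PIXEL_CLOCK_HZ * bits_per_pixel * TMDS_OVERHEAD

    for fmt, depth in [("4:4:4", 8), ("4:4:4", 10), ("4:2:2", 10), ("4:2:0", 10)]:
        rate = hdmi_rate_bps(fmt, depth)
        verdict = "fits" if rate <= LINK_LIMIT_BPS else "exceeds 18 Gbps"
        print(f"{fmt} {depth}-bit: {rate / 1e9:.2f} Gbps -> {verdict}")
    ```

    This reproduces the article’s conclusions: 4K60 4:4:4 at 8 bit just squeezes into 18 Gbps (about 17.8 Gbps on the wire), 10-bit 4:4:4 does not (about 22.3 Gbps), while 10-bit 4:2:2 and 4:2:0 fit comfortably.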

    1. You realize no big-brand TV has DisplayPort. HDMI has paid huge money over the last 13 years to be the only supplier of ports and have their name solely on TVs. It’s not that the brands wouldn’t put DisplayPort on; it’s that HDMI has a very expensive agreement to be the only one able to put ports on TVs.

  15. Joe:

    Yes, DisplayPort 1.4 can do 4K/60Hz/10-bit/HDR/4:4:4.

    HDMI 2.1 can do a whole lot more, including 4K/144Hz/10-bit/HDR/4:4:4, but it’s still very new. It also has its own adaptive sync technology.

    On a separate note, the article failed to mention that the iTunes Store has both HDR10 and Dolby Vision titles for rental and purchase.

  16. Even though UHD Blu-rays say 4:2:0 chroma subsampling, it’s actually 4:4:4 12-bit encased in the metadata, almost like a zip file. So it depends on your player, PC, TV or monitor, and your DisplayPort or HDMI cable: if it can’t handle that bandwidth it will downgrade to 4:2:2 / 4:2:0 10-bit, or may not play at all, or give issues like stuttering, frame skipping, or screen tearing.

  17. For the HDMI 2.0a spec:
    you can do 4K 60Hz 4:4:4 8-bit, RGB or YCbCr; you cannot do 10-bit 4:4:4 at 4K 60Hz, as there is not enough bandwidth.

    You can do 4:2:2 10-bit.

    HDR films are 4:2:0 and games are 4:2:2.

    You’re best to leave everything at RGB 8-bit (PC or game console); when HDR is actually used, it will automatically switch to 4:2:2 10-bit YCbCr.

    Yes, HDR is part of the HDMI 2.0 standard, but the hardware was in receivers before HDR was finalized, so most AVRs required a software update to be able to pass HDR through. That is something to keep in mind.

    HDMI 2.1 will address a lot of these issues, though I’m not sure it’s even an issue.

    I would like to be able to play HDR games in 4:4:4 10-bit at 4K, but it’s not a deal breaker.

    I myself own a 2016 55″ Samsung KS8000 and a 2015 Yamaha RX-V779.

  18. This is certainly THE BEST treatise I’ve found that lays out all the elements of 4K, UHD, and “deep color” in one single place, and fits it all together so well.
