The following is taken directly from the ST2084 (PQ EOTF) specification.
This EOTF (ST2084) is intended to enable the creation of video images with an increased luminance range; not for creation of video images with overall higher luminance levels. For consistency of presentation across devices with different output brightness, average picture levels in content would likely remain similar to current luminance levels; i.e. mid-range scene exposures would produce currently expected luminance levels appropriate to video or cinema.
Referring to PQ as an 'absolute' standard means that for each input data level there is an absolute output luminance value, which has to be adhered to. There is no allowance for variation, such as changing the gamma curve (EOTF), or increasing the display's light output, as that is already maxed out.
(This statement ignores dynamic metadata, more on which later.)
The following table shows an example PQ EOTF for a 1000 nit TV.
| Input Data - 10 bit (full range) | Output Luminance - nits |
|---|---|
| 0 | 0 |
| 520 | ~100 |
| 769 | ~1000 |
| 1023 | 1000 (clipped at display peak) |
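For reference, the table values can be reproduced directly from the ST2084 formula. Below is a minimal Python sketch of the PQ EOTF using the constants published in the specification (full-range 10-bit code values are assumed; note the curve itself runs all the way to 10,000 nits, so a 1000 nit TV clips - or rolls off - everything above code ~769):

```python
# ST2084 constants, as published in the specification
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32     # 18.8515625
C3 = 2392 / 4096 * 32     # 18.6875

def pq_eotf(code, bit_depth=10):
    """Absolute output luminance in nits for a full-range PQ code value."""
    e = code / (2 ** bit_depth - 1)                    # normalise signal to 0..1
    ep = e ** (1 / M2)
    y = (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)
    return 10000.0 * y                                  # PQ references 10,000 nits

for code in (0, 520, 769, 1023):
    print(code, round(pq_eotf(code), 1))   # 0.0, 100.2, 999.2, 10000.0
```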
Understanding the above, regarding the need (or not) for metadata, raises an interesting thought on how different HDR displays could be compared.
Assuming that modern HDR displays can all reach a given minimum peak luma level, say 1000 nits, an HDR source mastered at 1000 nits, using a gamut that is within the UHD specification of 90% P3, should not trigger any 'metadata' based processing (tone mapping/roll-off) within the playback display, allowing for a direct comparison of the different displays' underlying calibration accuracy.
From this first comparison, it would then be a relatively simple additional step to compare the same displays with different source material mastered at 2000+ nits, with full P3 gamut, enabling each display's tone mapping/roll-off to be compared separately.
Based on this concept, we have added a 'Colour Sub-Space' capability to LightSpace, for example enabling P3 to be profiled within a Rec2020 container.
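As an illustration of the container concept, the sketch below (a hypothetical example, not LightSpace code) derives the P3-to-Rec2020 conversion matrix from the published chromaticities, showing how P3 material sits within a Rec2020 container:

```python
import numpy as np

def rgb_to_xyz(primaries, white):
    """Build a linear RGB -> XYZ matrix from CIE xy chromaticities."""
    m = np.array([[x / y, 1.0, (1.0 - x - y) / y] for x, y in primaries]).T
    wx, wy = white
    w_xyz = np.array([wx / wy, 1.0, (1.0 - wx - wy) / wy])
    scale = np.linalg.solve(m, w_xyz)   # scale each primary so white maps to D65
    return m * scale

# Published chromaticities: P3-D65 and Rec2020, both with a D65 white point
P3    = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
R2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
D65   = (0.3127, 0.3290)

# P3 -> Rec2020: how P3 material is encoded inside a Rec2020 container
p3_to_2020 = np.linalg.inv(rgb_to_xyz(R2020, D65)) @ rgb_to_xyz(P3, D65)

# Pure P3 red expressed in the Rec2020 container:
# ~[0.754, 0.046, -0.001] - it sits at only ~75% of the Rec2020 red axis
# (the near-zero terms show it lies essentially on the container boundary)
print(p3_to_2020 @ np.array([1.0, 0.0, 0.0]))
```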
The following statement is taken from Dolby's own 'Dolby Vision for the Home' white paper.
"The current TV and Blu-ray standards limit maximum brightness to 100 nits and minimum brightness to 0.117 nits..."
Unfortunately, at best this is an inaccurate statement, at worst it is marketing hyperbole, as the Blu-ray format has no such limits for min or max brightness levels - these values are defined by the display's set-up; remember, SDR is a relative standard, not absolute. The minimum level (the black level) is usually just the minimum the display can attain, and can range from very dark (0.0001 nits, for example) on OLED displays to higher levels (around 0.03 nits, or even higher) on cheaper LCD displays. The maximum brightness is often set far higher on home TVs to overcome surrounding room light levels, with many home TVs set to 300 nits or more.
Note: The statement that 'The minimum level (the black level) is usually just the minimum the display can attain' refers to the fact that OLED black can often be too low, and users often choose to lift it to prevent shadow detail clipping/crushing - something that becomes even more apparent with home HDR OLEDs.
When the original SDR Blu-ray material is graded, the displays used will be calibrated to 80-120 nits (100 nits being the common average value), within a controlled grading environment (a dark environment), with the black level being from around 0.001-0.03 nits, depending on the display used (although the higher value is often used to maintain 'pleasant' images when viewed on the wider range of home TVs, with variable black levels!). And as mentioned above, when the Blu-ray is viewed in a home environment it is often necessary to set the TV to brighter levels to overcome surrounding room light levels.
The reality is PQ based HDR does nothing for black levels, and that is true of shadow detail too - no matter what the less knowledgeable, or marketing material, may say.
A good example of inaccurate information used to promote the 'benefits' of HDR can be seen in this presentation on YouTube, where improved shadow detail was presented as an example of the benefits HDR brings over SDR... which is incorrect. The reality is the SDR image is probably just poorly graded, potentially even deliberately so, to promote HDR. HDR provides no such benefit over SDR in shadow detail.
And in reality, due to the EOTF curve used for PQ based HDR, blacks under normal home viewing conditions will often be 'crushed' when compared to SDR versions of the same image. This is borne out by the surround illumination level specified as preferred for HDR being 5 nits, while for SDR it was originally specified as 10% of the maximum brightness of the display - for a 100 nit SDR display that is 10 nits, double the HDR figure. That is a large discrepancy, and shows that HDR blacks/shadows will often be washed-out/clipped when viewed in any environment where the ambient light levels cannot be controlled.
In reality, a 10-bit SDR image will potentially have better black/shadow detail than a PQ based HDR image.
Different viewing environments really need different display gamma, which the 'Absolute' PQ based HDR standard cannot address.
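A toy illustration of the difference: with a simple power-law SDR display (a BT.1886-style gamma of 2.4 is assumed here for simplicity), the same signal value scales with whatever peak luminance the display is set to, whereas the PQ EOTF sketched earlier always returns the same absolute nit value for a given code:

```python
def sdr_luminance(signal, peak_nits, gamma=2.4):
    """Relative SDR: output tracks the display's peak, so the image can be
    scaled up as a whole to suit the viewing environment."""
    return peak_nits * signal ** gamma

for peak in (100, 300):                # dark grading suite vs bright living room
    print(sdr_luminance(0.5, peak))    # mid-signal rises with peak: 18.9 vs 56.8
```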
There is a further potential issue with black levels: as no display can reach zero black, a PQ display would natively clip the input signal at the bit value corresponding to the display's minimum black, due to the 'absolute' nature of the PQ EOTF. This means any PQ display requires some form of 'shadow' roll-off to prevent clipping, but this in turn will exacerbate shadow crushing on displays with higher black levels.
This issue has been widely ignored, and can be the cause of very poor shadow clipping/crushing on HDR displays with poor PQ EOTF implementations.
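For reference, BT.2390's EETF includes a black-level adaptation step of broadly this form; the sketch below is a simplified version of it (a normalised, full-range PQ signal is assumed):

```python
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Inverse EOTF: absolute luminance (nits) -> normalised PQ signal."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def lift_blacks(e, display_min_nits):
    """BT.2390-style black-level adaptation: roll the shadow end of the PQ
    signal up towards the display's minimum black instead of hard-clipping."""
    b = pq_encode(display_min_nits)
    return e + b * (1 - e) ** 4

# A signal at 0.0005 nits would clip on a display with a 0.05 nit black floor;
# the roll-off instead maps it to just above the display minimum
e_in = pq_encode(0.0005)
print(lift_blacks(e_in, 0.05))   # ~0.0498, slightly above pq_encode(0.05) ~0.0463
```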
It is worth reiterating that, no matter what is said elsewhere, no HDR standard can produce 'darker blacks', as blacks are set by the minimum black level the display technology can attain, and the present day SDR (Standard Dynamic Range) Rec709 standard already uses the minimum black attainable on any given display.
It is worth pointing out that, due to the logarithmic response of the human eye to changes in light levels, the present day SDR (Standard Dynamic Range) Rec709 'standard' of 100 nits actually sits at around 50% of the PQ signal range that peaks at 10,000 nits - 100 nits encodes to roughly code 520 out of 1023 in 10-bit PQ.
(Note: 'standard' is in quotes as Rec709 is a relative standard, and so scaling the peak luma levels to overcome environmental light issues is an acceptable approach, while PQ HDR is an absolute nits based standard, and so cannot be scaled.)
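This is easy to verify with the published PQ constants: the inverse EOTF places 100 nits at roughly 51% of the signal range:

```python
# Where on the PQ curve does SDR's 100 nit peak sit?
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
y = (100 / 10000) ** m1
print(((c1 + c2 * y) / (1 + c3 * y)) ** m2)   # ~0.509, i.e. ~51% of the PQ range
```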
The following image shows the reality of this when referenced to different peak white levels.
The higher the resolution, the shorter the viewing distance needs to be from the screen.
Conversely, the greater the viewing distance, the lower the actual display resolution can be for the same apparent image resolution/quality.
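As a rough illustration, the often-quoted optimal viewing distances can be derived from the standard ~1 arcminute visual acuity limit - the distance, in picture heights, at which one pixel subtends one arcminute:

```python
import math

def optimal_distance(vertical_pixels, acuity_arcmin=1.0):
    """Viewing distance (in picture heights) at which one pixel subtends
    the eye's ~1 arcminute resolution limit (flat-screen approximation)."""
    half_angle = math.radians(vertical_pixels * acuity_arcmin / 60 / 2)
    return 1.0 / (2 * math.tan(half_angle))

print(optimal_distance(1080))   # HD: ~3.2 picture heights
print(optimal_distance(2160))   # 4K: ~1.5 picture heights
```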