Here’s the problem: most 4K Blu-ray titles on the market at the moment are graded on 1000-nit mastering monitors, with a small handful mastered on the 4000-nit Dolby Pulsar reference monitor. When a 600-nit HDR TV is asked to display a 1000-nit HDR source video, there are three common ways this can be handled:
- The TV can follow the PQ curve closely up to 600 nits and then discard all the bright detail between 600 and 1000 nits. This is known as clipping.
- The TV can compress the entire source dynamic range from 0 to 1000 nits to fit within its 600-nit brightness capability, but the APL (average picture level) of the output will be lower than that of the source, and there’ll be compression throughout the range (shadow detail and specular highlights getting obscured).
- The TV can track the PQ curve closely until, say, 450 nits, and then spend the remaining 150 nits of headroom displaying the detail between 450 and 1000 nits. This is the tone-mapping approach that most manufacturers use. The highlight detail will still be present, but more compressed (and obviously less bright) than if the same content were displayed on a 1000-nit television with a correct PQ implementation (see the code sketch after this list).
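To make the three behaviours concrete, here is a minimal Python sketch, not any manufacturer’s actual algorithm: the 450-nit knee and the linear roll-off are illustrative assumptions (real TVs use proprietary, usually curved, roll-off functions), while the PQ decode follows the published SMPTE ST 2084 EOTF.

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(signal: float) -> float:
    """Decode a normalised PQ signal (0.0-1.0) to absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def clip(nits: float, display_peak: float = 600.0) -> float:
    """Strategy 1: track the source exactly, discard everything above panel peak."""
    return min(nits, display_peak)

def scale(nits: float, source_peak: float = 1000.0,
          display_peak: float = 600.0) -> float:
    """Strategy 2: compress the whole 0-1000 nit range into 0-600; APL drops."""
    return nits * display_peak / source_peak

def knee_rolloff(nits: float, knee: float = 450.0,
                 source_peak: float = 1000.0,
                 display_peak: float = 600.0) -> float:
    """Strategy 3: follow PQ 1:1 up to the knee, then squeeze the 450-1000 nit
    source detail into the remaining 450-600 nit output headroom (linear here
    for simplicity; real sets use a curved roll-off)."""
    if nits <= knee:
        return nits
    headroom = display_peak - knee   # 150 nits of output left
    span = source_peak - knee        # 550 nits of source detail to fit
    return knee + (min(nits, source_peak) - knee) * headroom / span

if __name__ == "__main__":
    print(f"{'source':>7} {'clip':>7} {'scale':>7} {'knee':>7}")
    for nits in (100, 300, 450, 600, 800, 1000):
        print(f"{nits:7.0f} {clip(nits):7.0f} "
              f"{scale(nits):7.0f} {knee_rolloff(nits):7.0f}")
```

Running the demo shows the trade-off described above: clipping preserves mid-tones but flattens everything above 600 nits to the same level, full-range scaling keeps every highlight gradation but lowers the APL of the whole image, and the knee approach leaves everything below 450 nits untouched while compressing the 450-1000 nit detail into the last 150 nits.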
Here HDTVTest is talking about the problems TVs have from lacking the nits that current masters are graded for, and about how clipping happens and how it is managed. In other words, the issue really does exist, and for HDR, nits matter whether we like it or not.