Taken from the French forum HDfever.fr, thread "Samsung JS9500 : TV UHD Full LED, Micro Dimming, Q.Dot, HDR"
Thanks YNOS

Okay, after doing a bit of research into this "true/native" 10-bit vs. "driven/processing" question, and going by, for example, this information:

Bitdepth

Bitdepth is one of the most important considerations when choosing a budget display. Fortunately, the rule is simple: never buy anything with less than a 10bit panel.

To explain the difference extremely briefly, 8bit means a video codec or display is only capable of 16.7 million colors. That sounds like a lot until you compare it to the 1.07 billion colors a 10bit panel (or codec) is capable of. It's quickly apparent why a 10bit panel is so superior to 8bit. This is especially true when working in codecs that are 10bit (or greater). An 8bit monitor would be incapable of displaying all the colors!
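For reference, here is a quick sketch (my own, not from the quoted article) of where those two numbers come from; it is just the levels-per-channel arithmetic, nothing panel-specific:

```python
# Colors a panel can address with N bits per channel: (2**N) ** 3.
for bits in (8, 10):
    levels = 2 ** bits          # shades per channel (R, G, B)
    colors = levels ** 3        # total RGB combinations
    print(f"{bits}-bit: {levels} levels/channel -> {colors:,} colors")

# 8-bit:  256 levels/channel  -> 16,777,216 colors    (~16.7 million)
# 10-bit: 1024 levels/channel -> 1,073,741,824 colors (~1.07 billion)
```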

One result of having fewer colors can be the introduction of banding in your images. See the image below for an example:

Compare the 8bit images (top) with their 10bit counterparts (bottom). If you look closely, you can see the tiny bars that appear in the 8bit palettes. These are introduced because there aren't enough colors to accurately display the whole spectrum. Compare this to the smooth gradient the 10bit images display with their larger number of colors. This is a simulated comparison, but fairly accurate as to what you might see when comparing an 8bit and a 10bit display.
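A minimal numeric sketch of the same kind of simulated comparison (my own illustration, not from the linked article): quantize one smooth ramp at each depth and count how many distinct steps it ends up with.

```python
import numpy as np

ramp = np.linspace(0.0, 1.0, 4096)        # an ideal, smooth gradient

for bits in (8, 10):
    levels = 2 ** bits - 1                # highest code value per channel
    quantized = np.round(ramp * levels) / levels
    steps = len(np.unique(quantized))
    print(f"{bits}-bit ramp: {steps} distinct steps")

# 8-bit ramp:  256 distinct steps  -> coarse enough to show up as bands
# 10-bit ramp: 1024 distinct steps -> four times finer, much harder to see
```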

There are some monitors which boast about their "10bit processing." Generally, these are using clever electronics to mask banding and other 8bit issues while still using an 8bit panel. This does not overcome the color difference. If a monitor has a 10bit panel, the manufacturer is going to proudly display that fact.
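The article does not say what those "clever electronics" are; a common candidate is temporal dithering (FRC), where the panel alternates between the two nearest 8-bit codes so that their average over a few frames lands on the intended 10-bit value. A rough sketch under that assumption (frc_frames is a hypothetical helper, not anything vendor-specific):

```python
import numpy as np

def frc_frames(target_10bit, n_frames=4):
    """Approximate a 10-bit code on an 8-bit panel by alternating frames."""
    base = target_10bit // 4               # nearest lower 8-bit code
    remainder = target_10bit % 4           # fraction of an 8-bit step to fake
    frames = np.full(n_frames, base, dtype=int)
    frames[:remainder] += 1                # bump some frames by one code
    return frames

target = 514                               # a 10-bit value between 8-bit steps
frames = frc_frames(target)
print(frames, "mean =", frames.mean(), "vs target / 4 =", target / 4)
# [129 129 128 128] mean = 128.5 vs target / 4 = 128.5
# The average lands on the intended value, but each frame is still plain 8-bit.
```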

Equally important to having a display capable of 10bit is ensuring your connection and video card are capable of sending a 10bit signal. If a single part of the chain is only capable of 8bit, then the output will be 8bit.
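In other words (a toy illustration of the same point, with a made-up setup), the effective depth is simply the minimum across the chain:

```python
# Hypothetical setup: one 8-bit link caps the whole chain.
chain = {"codec": 10, "video card": 8, "cable": 10, "panel": 10}
effective_bits = min(chain.values())
print(f"Effective output: {effective_bits}-bit")   # 8-bit, despite the 10-bit panel
```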

DVI is (generally) not capable of 10bit (though a few monitors have an implementation); DisplayPort and HDMI are. NVIDIA GeForce video cards, while able to output 10bit in DirectX mode, are effectively 8bit in Premiere and the like, falling back to OpenGL rendering. NVIDIA Quadro cards are 10bit thanks to their 10bit color buffers. AMD is similar, with its Radeon cards only outputting 8bit; its pro-line FirePro and FireGL cards are a must to achieve full 10bit capability.
Link:

Let's Talk Monitors: What Makes a Good Display

To summarize, in order of quality:

1.- True 10-bit, which I understand corresponds to the HDR lines JS9500, JS9000, JS8500. Far superior to 8-bit panels: 1.07 billion colors vs. "only" 16.7 million.

2.- 10 "bit driven" en realidad paneles de 8 bits, que utilizan diferentes procesados, para disimular algunas de las carencias que sufren los paneles de 8 bits "nativos" (banding por ejemplo) pero siendo inferior en cuanto a reproducción de los colores que un panel "nativo" de 10 bits. Entiendo que estos paneles los montarían las gamas medias JU7500, JU7000, JU6670.

3.- Plain 8-bit, which is unable to show every color with the precision of the "true" or "simulated" 10-bit panels; I understand these would be the JS6XXX lines.

That's how I understand it, but I'll say up front that beyond a handful of basic concepts I'm no expert at all in this kind of technical subject.

I'd appreciate any clarification or correction from those who know more about this.

Regards