Windows 10 + JRVR + HDR
bogdanbz:
I don't think I made myself understood. :)
An SDR PC monitor is usually configured to apply a gamma transform to the incoming video signal so that the overall gamma transform is 2.2.
An SDR TV set is usually configured to apply a gamma transform to the incoming video signal so that the overall gamma transform is 2.4.
Most of us using Theater View do so on a TV, so our displays are set up so that the overall gamma transform is 2.4.
PC and TV displays in HDR mode no longer apply different tone mapping (which is what the gamma transform is) to the incoming video signal, because HDR defines fixed luminance points (in nits) for each RGB value. It falls on the shoulders of the Windows DWM to emulate the tone mapping of different SDR displays when it converts the sRGB content of various applications to HDR10 code points. HDR displays receive these HDR10 values and display them as is; it's no longer their role to apply an additional arbitrary gamma transform to their input like they do in SDR mode.
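To make "fixed luminance points" concrete, here is a rough sketch (illustrative C++; the constants are the published SMPTE ST 2084 ones) of the PQ EOTF that HDR10 uses to turn a code value into an absolute luminance:

```cpp
#include <cmath>
#include <cstdio>

// ST.2084 (PQ) EOTF: maps a normalized HDR10 code value E in [0,1] to an
// absolute luminance in nits. Constants are from SMPTE ST 2084.
double PqToNits(double E)
{
    const double m1 = 2610.0 / 16384.0;        // 0.1593017578125
    const double m2 = 2523.0 / 4096.0 * 128.0; // 78.84375
    const double c1 = 3424.0 / 4096.0;         // 0.8359375
    const double c2 = 2413.0 / 4096.0 * 32.0;  // 18.8515625
    const double c3 = 2392.0 / 4096.0 * 32.0;  // 18.6875

    const double p   = std::pow(E, 1.0 / m2);
    const double num = std::fmax(p - c1, 0.0);
    const double den = c2 - c3 * p;
    return 10000.0 * std::pow(num / den, 1.0 / m1);
}

int main()
{
    // A given code value always means the same luminance: e.g. 10-bit
    // code 520 is about 100 nits on any compliant HDR10 display.
    for (int code : {0, 256, 520, 769, 1023})
        std::printf("code %4d -> %9.3f nits\n", code, PqToNits(code / 1023.0));
}
```

Unlike an SDR gamma, the result is an absolute number of nits; a compliant display is not supposed to reinterpret it.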
The Windows DWM, when faced with a buffer containing sRGB content (that is, RGB values pre-encoded according to the sRGB gamma), doesn't know that:
- oh, I have to transform the content of this front buffer (of Calc.exe, for instance) to HDR RGB values so that the overall image luminance is the same as when the content is displayed on an SDR display that applies an overall 2.2 gamma
- and oh, I have to transform the content of this other front buffer (of the Theater/Display view window in MC.exe) to HDR RGB values so that the overall image luminance is the same as when the content is displayed on an SDR display that applies an overall 2.4 gamma
Since Windows is a PC OS, it assumes it has to apply a mapping that emulates what a PC SDR monitor does, so an overall 2.2 gamma. What that means is that when we look at SDR movies rendered by JRVR on our TV in HDR mode, we are very likely seeing them with an overall 2.2 gamma applied, instead of the overall 2.4 gamma (which is darker) that we get on the same TV in SDR mode.
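Conceptually, that mapping looks something like this sketch (illustrative C++ only, not Windows' actual implementation; the emulated gamma and the SDR white level are the two knobs that matter):

```cpp
#include <cmath>
#include <cstdio>

// ST.2084 (PQ) inverse EOTF: encodes an absolute luminance (nits) as a
// normalized PQ value in [0,1]. Constants are from SMPTE ST 2084.
double NitsToPq(double nits)
{
    const double m1 = 2610.0 / 16384.0, m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0, c3 = 2392.0 / 4096.0 * 32.0;
    const double y = std::pow(nits / 10000.0, m1);
    return std::pow((c1 + c2 * y) / (1.0 + c3 * y), m2);
}

// The compositor's job, conceptually: decode the SDR value with the gamma
// of the SDR display it emulates, scale to the chosen SDR white level, and
// re-encode as PQ. The emulated gamma changes the resulting HDR10 code points.
double SdrToHdr10(double sdrValue, double emulatedGamma, double sdrWhiteNits)
{
    const double linear = std::pow(sdrValue, emulatedGamma);
    return NitsToPq(linear * sdrWhiteNits);
}

int main()
{
    // The same dark SDR pixel, two emulated displays, SDR white at 80 nits:
    std::printf("gamma 2.2 -> PQ %.4f\n", SdrToHdr10(0.1, 2.2, 80.0)); // brighter
    std::printf("gamma 2.4 -> PQ %.4f\n", SdrToHdr10(0.1, 2.4, 80.0)); // darker
}
```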
In order for Windows to know that it has to map the front buffers of those two applications to HDR code points in different ways, the content of the back buffer needs to somehow be flagged in a way that lets the DWM know: map the content of this front buffer to HDR the way an SDR display configured for an overall 2.2 gamma would, and map this other front buffer to HDR the way an SDR display configured for an overall 2.4 gamma would. I believe such a way of tagging already exists; it's what you are already using to tag the swap chain content as HDR10.
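For reference, tagging a swap chain as HDR10 with DXGI looks roughly like this (a minimal sketch with error handling trimmed; I'm not claiming this is MC's actual code):

```cpp
#include <windows.h>
#include <dxgi1_6.h>

// Sketch: flag an existing swap chain (created with an HDR-capable format
// such as DXGI_FORMAT_R10G10B10A2_UNORM) as containing HDR10 data.
void TagSwapChainAsHdr10(IDXGISwapChain* swapChain)
{
    IDXGISwapChain3* sc3 = nullptr;
    if (SUCCEEDED(swapChain->QueryInterface(IID_PPV_ARGS(&sc3))))
    {
        // HDR10 = BT.2020 primaries + ST.2084 (PQ) transfer function.
        const DXGI_COLOR_SPACE_TYPE hdr10 =
            DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;
        UINT support = 0;
        if (SUCCEEDED(sc3->CheckColorSpaceSupport(hdr10, &support)) &&
            (support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
        {
            // From now on, the DWM treats the buffer as HDR10 code values
            // and passes them through instead of applying its own
            // SDR-to-HDR mapping.
            sc3->SetColorSpace1(hdr10);
        }
        sc3->Release();
    }
}
```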
There is, of course, still an issue: the slider in the Windows HD Color settings that lets one choose which HDR code point the SDR white is mapped to. I think by default it maps SDR white to 80 nits, which is usually much less than what people run their SDR displays at. Moving the mapping to a higher value will also affect the tone mapping of SDR to HDR.
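That slider's value can actually be read programmatically; here's a rough sketch using the documented display-config API (SDRWhiteLevel is in units of 1/1000 of 80 nits, so the 80-nit default reads back as 1000):

```cpp
#include <windows.h>
#include <cstdio>
#include <vector>

// Sketch: read the SDR white level (the "SDR content brightness" slider)
// for each active display path.
int main()
{
    UINT32 pathCount = 0, modeCount = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS,
                                    &pathCount, &modeCount) != ERROR_SUCCESS)
        return 1;
    std::vector<DISPLAYCONFIG_PATH_INFO> paths(pathCount);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(modeCount);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &pathCount, paths.data(),
                           &modeCount, modes.data(), nullptr) != ERROR_SUCCESS)
        return 1;

    for (UINT32 i = 0; i < pathCount; ++i)
    {
        DISPLAYCONFIG_SDR_WHITE_LEVEL level = {};
        level.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_SDR_WHITE_LEVEL;
        level.header.size = sizeof(level);
        level.header.adapterId = paths[i].targetInfo.adapterId;
        level.header.id = paths[i].targetInfo.id;
        if (DisplayConfigGetDeviceInfo(&level.header) == ERROR_SUCCESS)
            std::printf("Display %u: SDR white mapped to %.1f nits\n",
                        i, level.SDRWhiteLevel * 80.0 / 1000.0);
    }
    return 0;
}
```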
Btw, I have my TV configured to use a 2.2 gamma for SDR now, which is why I don't actually see much difference between watching SDR content on the TV in SDR mode and in HDR mode. As long as you use the slider to increase the brightness of the SDR content when viewed in Windows HDR mode, it will look quite similar.
Hendrik:
Differences in gamma do not really account for that. Not for the small difference between 2.2 and 2.4. It's a noticeable difference on a good screen, but it would not introduce actual black crush or artifacting as described above.
There is also an option in JRVR to change the gamma for output. By default it preserves the gamma encoded in the video for SDR without further alterations, as that's what most people expect to see, and for most videos that is closer to 2.2 than 2.4 anyway.
Such an explanation also does not account at all for the fact that Windows 11 somehow just does it better. That alone speaks to Windows 10 simply not doing it all that well.
bogdanbz:
2.2 to 2.4 is a large difference. It affects values near black, where our eyes perceive a very large gap for very small luminance deltas, in total contrast to the same amount of variance between larger luminance values, which we barely notice. And even 2.2 versus the sRGB gamma is not a small difference; a pure 2.2 power curve is visibly darker near black than sRGB.
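To put numbers on it: for the same encoded value V, the luminance ratio between the two is V^2.2 / V^2.4 = V^-0.2, which blows up toward black. A quick C++ sketch:

```cpp
#include <cmath>
#include <cstdio>

// For the same encoded value V, the luminance ratio between a display
// decoding with gamma 2.2 and one decoding with gamma 2.4 is
// V^2.2 / V^2.4 = V^-0.2, which grows without bound as V approaches 0.
int main()
{
    for (double v : {0.05, 0.10, 0.25, 0.50, 0.90})
        std::printf("V = %.2f: gamma 2.2 is %4.0f%% brighter than 2.4\n",
                    v, (std::pow(v, -0.2) - 1.0) * 100.0);
}
```

Near black the 2.2 display comes out roughly 60-80% brighter in relative terms; near white the difference is only a couple of percent.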
I agree that the black crush and artifacting are most likely due to the TV's handling of near-black in HDR, not the SDR-to-HDR mapping in Windows. Although a brighter tone mapping, such as Windows emulating a 2.2 SDR display, would reveal macroblocks that would otherwise be harder to notice on an SDR display applying a 2.4 gamma tone mapping.
I don't know what Windows 11 does differently; maybe there are indeed some issues with the SDR-to-HDR color mapping in Windows 10 that are fixed in Windows 11. But the tone mapping of SDR to HDR has to follow the same process I described above: Windows has to know which of all the possible SDR gammas to emulate when it does this SDR-to-HDR mapping, and it can't know that without some hint on the front buffers about their content.
The alternative is for the application itself to use a swap chain flagged as HDR10 and fill it with SDR content it has tone-mapped to HDR10 on its own. Then the Windows DWM no longer needs to apply an arbitrary tone map to that content.
jmone:
FWIW, I've upgraded all my HTPCs to Windows 11 and now leave OS HDR on all the time (they all have HDR displays). I no longer have any of the issues with odd-looking SDR that I had with Windows 10. It just works, and works surprisingly well.