I don't think I made myself understood.
An SDR PC monitor is usually configured to apply a gamma transform to the incoming video signal such that the overall gamma transform is 2.2.
An SDR TV set is usually configured to apply a gamma transform to the incoming video signal such that the overall gamma transform is 2.4.
Most of us use Theater View with a TV display, so our displays give us an overall gamma transform of 2.4.
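Just to put a number on that difference (my own illustration, nothing MC-specific): for the same encoded signal level, an overall 2.4 gamma produces noticeably less light in the midtones than an overall 2.2 gamma:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Same encoded mid-grey signal, two different end-to-end display gammas.
    const double signal  = 0.5;                      // normalized video level
    const double light22 = std::pow(signal, 2.2);    // ~0.218 of peak luminance
    const double light24 = std::pow(signal, 2.4);    // ~0.189 of peak luminance
    std::printf("2.2: %.1f%%  2.4: %.1f%% of peak\n", light22 * 100, light24 * 100);
}
```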
PC and TV displays in HDR mode no longer apply their own, different tone mapping (which is what the gamma transform is) to the incoming video signal, because HDR defines fixed luminance points (in nits) for each RGB code value. It therefore falls on the shoulders of the Windows DWM to emulate the tone mapping of different SDR displays when it converts the sRGB content of various applications to HDR10 code points: HDR displays receive these HDR10 values and display them as is, and it is no longer their role to apply an additional, arbitrary gamma transform to their input like they do in SDR mode.
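To illustrate what "fixed luminance points" means: the HDR10 transfer function (SMPTE ST 2084, the PQ curve) ties every code value to an absolute luminance in nits, with no display-dependent gamma left to apply. A rough sketch of the encode side, using the standard PQ constants (nothing MC-specific):

```cpp
#include <cmath>
#include <cstdint>

// ST 2084 (PQ) encode: absolute luminance in nits -> 10-bit HDR10 code value.
// The curve is defined over 0..10000 nits; a given code value always means the
// same number of nits, regardless of which display receives it.
uint16_t pq_encode_10bit(double nits) {
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    const double y  = std::fmin(std::fmax(nits / 10000.0, 0.0), 1.0);
    const double ym = std::pow(y, m1);
    const double v  = std::pow((c1 + c2 * ym) / (1.0 + c3 * ym), m2);
    return static_cast<uint16_t>(std::lround(v * 1023.0));  // e.g. 100 nits -> ~520
}
```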
The Windows DWM, when faced with a buffer containing sRGB content (that is, RGB values pre-encoded according to the sRGB gamma), doesn't know that:
- oh, I have to transform the content of this front buffer (of Calc.exe, for instance) to HDR RGB values so that the overall image luminance is the same as when the content is displayed on an SDR display with an overall 2.2 gamma
- and oh, I have to transform the content of this other front buffer (of the Theater/Display view window in MC.exe) to HDR RGB values so that the overall image luminance is the same as when the content is displayed on an SDR display with an overall 2.4 gamma
Since Windows is a PC OS, it assumes it has to apply a mapping that emulates what a PC SDR monitor does, so an overall 2.2 gamma. What that means is that when we look at SDR movies rendered by JRVR on our TV in HDR mode, we are very likely seeing them with an overall 2.2 gamma applied, instead of the overall 2.4 gamma (which is darker) that we get when we watch them on the same TV in SDR mode.
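To make that concrete, here is a rough sketch of the kind of conversion I assume the compositor effectively performs (my own illustration, not Microsoft's actual code): the same sRGB-encoded pixel lands at a different absolute luminance depending on which SDR display gamma the conversion pretends the content was made for. Those nits would then go through the PQ curve from the earlier sketch:

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical illustration: turn an sRGB-encoded value into absolute nits,
// emulating an SDR display with a given end-to-end gamma. The result would
// then be PQ-encoded and sent to the HDR display.
double sdr_to_nits(double encoded, double displayGamma, double sdrWhiteNits) {
    const double linear = std::pow(encoded, displayGamma);  // emulated display light
    return linear * sdrWhiteNits;                           // scale to the SDR white level
}

int main() {
    const double pixel    = 0.5;    // sRGB-encoded mid-grey from the app's buffer
    const double sdrWhite = 200.0;  // assumed SDR white level chosen by the user
    std::printf("assuming 2.2: %.1f nits\n", sdr_to_nits(pixel, 2.2, sdrWhite));  // ~43.5
    std::printf("assuming 2.4: %.1f nits\n", sdr_to_nits(pixel, 2.4, sdrWhite));  // ~37.9
}
```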
In order for Windows to know that it has to map the front buffers of those two applications to HDR code points in a different manner, the content of the back-buffer needs to somehow be flagged in a way that lets the DWM know: map the content of this front buffer to HDR so that it looks the way it would on an SDR display configured for an overall 2.2 gamma, and map this other front buffer to HDR so that it looks the way it would on an SDR display configured for an overall 2.4 gamma. I believe such a tagging mechanism already exists; it's what you are already using to tag the swap chain content as HDR10.
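For reference, the tagging I mean is the DXGI color-space flag on the swap chain. A minimal sketch of that existing mechanism (standard DXGI calls; the `swapChain` variable is assumed to be an IDXGISwapChain3 created with an HDR-capable format such as DXGI_FORMAT_R10G10B10A2_UNORM):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Tag an existing swap chain's contents as HDR10 (BT.2020 primaries, PQ transfer).
HRESULT TagSwapChainAsHdr10(const ComPtr<IDXGISwapChain3>& swapChain) {
    const DXGI_COLOR_SPACE_TYPE hdr10 = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;
    UINT support = 0;
    HRESULT hr = swapChain->CheckColorSpaceSupport(hdr10, &support);
    if (FAILED(hr) || !(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
        return E_FAIL;
    // With this tag the compositor interprets the buffer as absolute PQ values.
    // For an sRGB-tagged buffer (DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709) it applies
    // its own SDR-to-HDR conversion instead, and there is no flag that says
    // "this SDR content was meant for a 2.4-gamma display".
    return swapChain->SetColorSpace1(hdr10);
}
```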
There is, of course, still an issue: that slider in the Windows HD Color settings which lets one choose which HDR code point (i.e. which luminance) SDR white is mapped to. I think by default it's set to map SDR white to 80 nits, which is usually much less than what people run their SDR displays at. Moving the mapping to a higher value will also affect the tone mapping of SDR to HDR.
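For what it's worth, the value behind that slider can be read back programmatically. A small sketch using the documented display-configuration API (SDRWhiteLevel is stored as a multiplier of 80 nits in thousandths, so 1000 means 80 nits):

```cpp
#include <windows.h>
#include <cstdio>
#include <vector>
#pragma comment(lib, "user32.lib")

// Read the SDR white level Windows uses for the first active display path.
int main() {
    UINT32 pathCount = 0, modeCount = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &pathCount, &modeCount) != ERROR_SUCCESS)
        return 1;
    std::vector<DISPLAYCONFIG_PATH_INFO> paths(pathCount);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(modeCount);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &pathCount, paths.data(),
                           &modeCount, modes.data(), nullptr) != ERROR_SUCCESS || paths.empty())
        return 1;

    DISPLAYCONFIG_SDR_WHITE_LEVEL white = {};
    white.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_SDR_WHITE_LEVEL;
    white.header.size = sizeof(white);
    white.header.adapterId = paths[0].targetInfo.adapterId;
    white.header.id = paths[0].targetInfo.id;
    if (DisplayConfigGetDeviceInfo(&white.header) != ERROR_SUCCESS)
        return 1;

    // 1000 == 80 nits, so the slider at its default maps SDR white to 80 nits.
    std::printf("SDR white level: %.0f nits\n", white.SDRWhiteLevel / 1000.0 * 80.0);
}
```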
Btw, I have my TV configured to use a 2.2 gamma for SDR now, which is why I don't actually see much difference between watching SDR content on the TV in SDR mode and in HDR mode. As long as you use the slider to increase the brightness of SDR content when viewed in Windows HDR mode, it will look quite similar.