I admit HDR is confusing, especially on projectors, but from what I understand, here are the facts and what I see.
First, my projector doesn't switch to its HDR input unless HDR is enabled in Windows OR a full-screen app forces the switch (which MC does, but its interaction with Windows HDR is unclear or buggy). Otherwise, the projector stays on the MHL input. I have read that only native HDR games are able to switch to HDR in full screen. I suspect this is a feature of Windows full-screen mode for any app.
Second, when HDR is enabled in Windows, it's clear that the picture "looks" HDR. Whether it's the desktop or a YouTube video, the projector switches to HDR on its own. If not, it stays on MHL.
Third, I configured my projector myself with a probe and calibration software after reading various forums. The first thing you have to set is 100% white... to white: RGB all at the same level. This gives you the maximum luminance. The HDR "effect" depends on the level of white at which you clip the gamma curve; in my case, 80% white. Maximum luminance is reached at 80% white (not at 100%, as with a straight gamma curve). The result is an "S"-shaped gamma curve. The probe reads the correct RGB levels for each level of white. Enabling "Dynamic Black" on my projector does add luminance by pushing the blue, but the HDR effect works very well without boosting the blue. I don't claim to fully understand how it works technically, but I clearly see the difference depending on whether HDR is enabled in Windows or not.
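The clipped-gamma idea above can be sketched numerically. This is a minimal illustration, not my projector's actual processing; the gamma exponent and the 80% knee are just the values from my setup, and the function name is made up for the example:

```python
def tone_curve(signal, gamma=2.2, clip_at=0.8):
    """Map a 0..1 video signal level to relative luminance (0..1).

    A straight power-law (gamma) curve reaches peak luminance only at
    signal = 1.0. Clipping at e.g. 80% white sends everything at or
    above the knee to full luminance, which is what produces the
    "S"-like, HDR-looking response described above.
    """
    if signal >= clip_at:
        return 1.0  # clipped region: already at peak white
    # rescale the remaining range so the curve stays continuous at the knee
    return (signal / clip_at) ** gamma

# Straight gamma curve: 80% white stays well below peak luminance.
straight = 0.8 ** 2.2
# Clipped curve: 80% white already hits maximum luminance.
clipped = tone_curve(0.8)
print(f"straight gamma at 80% white: {straight:.3f}, clipped curve: {clipped:.3f}")
```

With `clip_at=1.0` the function degenerates to a plain gamma curve, which matches what I observe when the clipping is removed.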
So, it does not matter who passes the HDR metadata, the OS or Nvidia; the result is the same: the PC does nothing and the TV does the tone mapping.
That's what is not clear to me. In a movie, the metadata are set by the producer. In Windows, there should be some generic interpolation. As I said, switching Windows to HDR switches my projector to HDR, and the difference is clearly visible. So what happens with native HDR content? That's my question.
You are confusing yourself: in one instance you want to bypass OS HDR, in the other you want the OS to handle HDR.
Yes, I don't want to switch manually in Windows, and as I said, switching introduces bugs in MC. If MC supported "windowed" full screen (like my games), it could let Windows handle HDR, while in "real" full-screen mode MC would handle HDR itself. Why, when the play menu or the right-click menu is displayed in full-screen mode and HDR is enabled in Windows, does MC switch to HDR (which was clearly not the case before)? Is it because MC leaves full-screen mode and Windows HDR takes over?
The only answer I see on this forum is to disable Windows HDR and let MC do the job. Why? No answer? I would like to choose for myself and do the comparison. If I disable Windows HDR, my desktop looks crappy (and MC HDR in full screen works, but is not very stable). If I enable Windows HDR, MC doesn't switch to HDR in full-screen mode (and seems to use Windows HDR in windowed mode). These are the facts.
Ideally, for me, HDR should always be on in Windows, and MC should take control in full-screen mode, as it does now when HDR is disabled in Windows. Is that possible?
EDIT: I have noticed that the Windows slider that configures the amount of HDR in Windows preferences changes the clipping value of the "S" curve, going from a straight curve to an "S" curve.