Not sure whether this is a pure MadVR question better suited to the Doom9 thread, or whether MC is involved, since MC also plays a part in switching display modes.
I have an LG E6 and an HTPC with an Nvidia GTX1050 GPU (390.65 driver). In Nvidia control panel, if I select 2160p YCbCr 4:2:2 and tell MadVR I have a 10-bit capable display, all works well for HDR video.
I noticed by accident that if I change the Nvidia output mode to RGB 8-bit and then play a 60Hz HDR test clip (e.g. LG's "Chess"), I get "sparkles", indicating HDMI errors. I then realised from the MadVR HUD that it was still in D3D11 Exclusive 10-bit mode, which should not be possible with RGB @60Hz using legal HDMI 2.0 modes, as RGB 10-bit would need a 22.3Gbps bitrate.
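For what it's worth, here is the back-of-the-envelope maths behind that figure, as a rough Python sketch. It assumes the standard CTA-861 timing for 2160p/60 (4400x2250 total, i.e. a 594MHz pixel clock) and TMDS encoding at 10 transmitted bits per character over 3 data lanes; the numbers are my own working, not anything MadVR or Nvidia report:

[code]
# Back-of-the-envelope HDMI 2.0 check for 2160p/60 RGB at various bit depths.
H_TOTAL, V_TOTAL, REFRESH = 4400, 2250, 60           # standard CTA-861 2160p/60 timing
PIXEL_CLOCK_MHZ = H_TOTAL * V_TOTAL * REFRESH / 1e6  # = 594 MHz
HDMI20_MAX_TMDS_MHZ = 600                            # max TMDS character rate in HDMI 2.0

for bpc in (8, 10, 12):
    tmds_mhz = PIXEL_CLOCK_MHZ * bpc / 8             # RGB/4:4:4 deep colour raises the TMDS clock
    gbps = tmds_mhz * 10 * 3 / 1000                  # 10 bits per TMDS character x 3 lanes
    verdict = "OK" if tmds_mhz <= HDMI20_MAX_TMDS_MHZ else "exceeds HDMI 2.0"
    print(f"{bpc}-bit RGB: TMDS {tmds_mhz:.1f} MHz, {gbps:.2f} Gbps -> {verdict}")
[/code]

That gives 17.82Gbps for 8-bit RGB (fine) and 22.28Gbps for 10-bit (not fine). As I understand it, YCbCr 4:2:2 is always carried in a 12-bit container at the 8-bit TMDS rate, which is why the 4:2:2 setting copes with 10-bit HDR at 60Hz.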
Unfortunately the LG E6 gives no useful information on the signal it is receiving, so I tried to diagnose this by hooking my HTPC up to my Oppo 203's HDMI input, as the Oppo can give comprehensive info on the signals it is both receiving and outputting. However, when I did this I noticed that MadVR would then say it was outputting D3D11 Exclusive 8-bit. It seems the Oppo was somehow forcing MadVR/the Nvidia driver to use only legitimate HDMI 2.0 modes.
Maybe this is not really a problem: it seems the E6 is insufficiently prescriptive about the signals it can accept, and MadVR is merely doing what it is told it can. Still, it seems odd that either MadVR or Nvidia would try to use a mode not allowed by HDMI 2.0.
I'm anything but an expert on EDID, but I did put the raw EDID data from MadVR into http://www.edidreader.com/ and did "a bit of reading online(!)", and I cannot see how the source device knows the max bit depth for any given display mode other than by "just knowing" the HDMI 2.0 spec, in which case it should know not to allow 4K/60 RGB 10-bit. I can see where it says it supports a max TMDS clock of 600MHz, that it supports 2160p/60 and that, generally, untied to any specific video format (VIC), it supports up to 12-bit. I don't see how it excludes bit depths at 50/60Hz higher than HDMI 2.0 allows.
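In case it helps anyone, here is roughly how I understand the relevant bytes, as a Python sketch that pulls the deep-colour flags and max TMDS rates out of the CTA-861 extension block (the second 128-byte block of the EDID). The parse_cta_block helper is just my own name for illustration, not anything from MadVR or edidreader:

[code]
# Rough sketch: read the deep-colour and max-TMDS fields from a CTA-861
# extension block. 'ext' is the 128-byte extension block as raw bytes.
def parse_cta_block(ext: bytes):
    assert ext[0] == 0x02, "not a CTA-861 extension block"
    dtd_offset = ext[2]               # data blocks run from byte 4 up to here
    i = 4
    info = {}
    while i < dtd_offset:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        payload = ext[i + 1:i + 1 + length]
        if tag == 3:                  # Vendor-Specific Data Block
            oui = payload[0] | (payload[1] << 8) | (payload[2] << 16)
            if oui == 0x000C03 and length >= 7:       # HDMI-LLC VSDB (HDMI 1.4)
                dc = payload[5]
                info["DC_30bit"] = bool(dc & 0x10)    # 10-bit RGB/4:4:4
                info["DC_36bit"] = bool(dc & 0x20)    # 12-bit RGB/4:4:4
                info["max_tmds_mhz_hdmi14"] = payload[6] * 5
            elif oui == 0xC45DD8 and length >= 5:     # HDMI Forum VSDB (HDMI 2.0)
                info["max_tmds_mhz_hdmi20"] = payload[4] * 5   # 600 on the E6
        i += 1 + length
    return info
[/code]

As far as I can tell, that is all the display advertises: a handful of deep-colour flags plus a maximum TMDS character rate, with nothing tying bit depth to particular refresh rates, so it seems to be left to the source to apply the HDMI 2.0 bandwidth rules itself.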
Anyone have any insights or thoughts (or interest!) in this somewhat nerdy question? It seems that, in an ideal world, MC would not try to output video modes that are not supported by HDMI 2.0, even when the user configures his PC in a less-than-ideal way.