madVR able to output non-HDMI 2.0 (HDR) Video Mode?
Jong:
Not sure if this is a pure MadVR question more suited to the Doom9 thread, or if MC is involved, as MC is also involved in switching display modes.
I have an LG E6 and an HTPC with an Nvidia GTX1050 GPU (390.65 driver). In Nvidia control panel, if I select 2160p YCbCr 4:2:2 and tell MadVR I have a 10-bit capable display all works well for HDR video.
I noticed by accident that if I change the Nvidia video mode to RGB 8-bit and then play a 60Hz HDR test clip (e.g. LG's "Chess"), I get "sparkles", indicating HDMI errors. I then realised from the madVR HUD that it was still in D3D11 Exclusive 10-bit mode, which should not be possible for RGB @60Hz using legal HDMI 2.0 modes. 4K/60 RGB 10-bit would need a bitrate of about 22.3Gbps.
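(For reference, that 22.3Gbps figure comes from a quick back-of-the-envelope check, assuming the standard 594MHz pixel clock for 2160p60 and HDMI's 10/8 TMDS encoding overhead - a sketch, not anything authoritative:)

[code]
# Sketch: estimated TMDS bandwidth needed for 2160p60 RGB output.
# Assumes the CTA-861 timing of 4400 x 2250 total pixels at 60Hz.
PIXEL_CLOCK_HZ = 594e6    # 4400 * 2250 * 60
TMDS_OVERHEAD = 10 / 8    # TMDS 8b/10b encoding overhead
CHANNELS = 3              # three TMDS data channels

def tmds_gbps(bits_per_channel):
    """Raw bit rate over the three TMDS lanes for full-rate (RGB/4:4:4) video."""
    return PIXEL_CLOCK_HZ * CHANNELS * bits_per_channel * TMDS_OVERHEAD / 1e9

print(f"2160p60 RGB  8-bit: {tmds_gbps(8):.1f} Gbps")   # ~17.8 Gbps - within HDMI 2.0's 18Gbps
print(f"2160p60 RGB 10-bit: {tmds_gbps(10):.1f} Gbps")  # ~22.3 Gbps - over the limit
[/code]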
Unfortunately the LG E6 gives no useful information about the signal it is receiving, so I tried to diagnose this by hooking my HTPC up to my Oppo 203's HDMI input, as the Oppo can give comprehensive info on both the signal it is receiving and the one it is outputting. However, when I did this I noticed that madVR would then say it was outputting D3D11 Exclusive 8-bit. It seems the Oppo was somehow forcing madVR/the Nvidia driver to use only legitimate HDMI 2.0 modes.
Maybe this is not really a problem - it seems the E6 is insufficiently prescriptive about the signals it can accept, and madVR is merely doing what it is told it can do. Still, it seems odd that either madVR or the Nvidia driver would try to use a mode not allowed by HDMI 2.0.
I'm anything but an expert on EDID, but I did put the EDID raw data from madVR into http://www.edidreader.com/ and do "a bit of reading online(!)", and I cannot see how the source device knows the maximum bit depth for any display mode other than by "just knowing" the HDMI 2.0 spec, in which case it should know not to allow 4K/60 RGB 10-bit. I can see where it says it supports a max TMDS clock of 600MHz, that it supports 2160p/60 and that, generally, untied to any specific VIC, it supports up to 12-bit. I don't see how it excludes bit depths at 50/60Hz higher than HDMI 2.0 allows.
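As a thought experiment, here is roughly the check I would expect a source to make using only the 600MHz Max_TMDS_Character_Rate from the EDID - the function is purely illustrative, not anything madVR or the Nvidia driver actually exposes:

[code]
# Sketch: can a candidate 2160p60 mode fit the sink's declared max TMDS
# character rate (600MHz on the E6)? Illustrative only.
MAX_TMDS_CHARACTER_RATE_MHZ = 600   # from the HDMI Forum VSDB in the EDID

def tmds_character_rate_mhz(pixel_clock_mhz, bits_per_component, chroma):
    """Deep colour (10/12-bit) RGB/4:4:4 raises the TMDS character rate above
    the pixel clock; 4:2:2 is always carried in a 24-bit container, so its
    rate stays at the pixel clock regardless of 10/12-bit depth."""
    if chroma == "4:2:2":
        return pixel_clock_mhz
    return pixel_clock_mhz * bits_per_component / 8

for bits, chroma in [(8, "RGB"), (10, "RGB"), (12, "4:2:2")]:
    rate = tmds_character_rate_mhz(594, bits, chroma)
    verdict = "legal" if rate <= MAX_TMDS_CHARACTER_RATE_MHZ else "exceeds the limit"
    print(f"2160p60 {chroma} {bits}-bit: {rate:.1f} MHz -> {verdict}")
[/code]

By that logic, 2160p60 RGB 10-bit needs a 742.5MHz character rate, so a source that honours the 600MHz cap should rule it out without the EDID ever tying bit depths to specific VICs.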
Anyone have any insights or thoughts (or interest!) in this somewhat nerdy question? :) It seems, in an ideal world, MC would not try to output video modes that are not supported by HDMI 2.0, even when the user configures his PC in a less-than-ideal way.
Hendrik:
Just because madVR renders a 10-bit image does not mean that this is what is going to go over the cable. In fact, you can always use 10-bit rendering mode, whether or not any of your hardware actually supports it. When it's not supported, the driver will simply reduce the bit depth of the image.
It's really quite similar to you using YCbCr 4:2:2 output. madVR still renders RGB, which is 4:4:4 by definition. So something has to change the image - the graphics driver.
No software gets close to being able to control what actually goes out the HDMI port; the driver is fully in charge of that.
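To give a rough idea of what "reduce the bit depth" means, something like this happens conceptually (a simplified sketch, not actual driver code; real drivers typically dither when they requantise):

[code]
import numpy as np

# Sketch: requantising a 10-bit frame to 8 bits with simple random-noise
# dithering. A toy illustration of the concept, not actual driver code.
rng = np.random.default_rng(0)
frame_10bit = rng.integers(0, 1024, size=(4, 4, 3), dtype=np.uint16)  # toy-sized frame

dither = rng.random(frame_10bit.shape)                       # noise in [0, 1)
frame_8bit = np.clip(frame_10bit / 4.0 + dither, 0, 255).astype(np.uint8)

print(frame_10bit.dtype, "->", frame_8bit.dtype)             # uint16 -> uint8
[/code]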
Jong:
Thanks @Hendrik,
It's interesting though, isn't it, that when the Oppo is in the chain madVR does switch to 8-bit mode, whereas without it madVR uses 10-bit mode and there are visible HDMI errors, errors that don't occur with 4K/60 YCbCr 4:2:2 12-bit or RGB 8-bit (both roughly 18Gbps). It does "seem" like an illegal bitrate is being used. If it is, it sounds like the driver's fault. But any thoughts on why madVR behaves differently with/without the Oppo?
Hendrik:
In-depth madVR questions should really be asked over at Doom9, where madshi can answer them.
Jong:
Yeah, that sounds fair. I just didn't want to ask there and find that MC's display-switching mechanism was in play.