INTERACT FORUM
Devices => Video Cards, Monitors, Televisions, and Projectors => Topic started by: Jong on February 21, 2018, 06:40:43 am
-
Not sure if this is a pure MadVR question better suited to the Doom9 thread, or if MC plays a part, since MC is also involved in switching display modes.
I have an LG E6 and an HTPC with an Nvidia GTX1050 GPU (390.65 driver). In Nvidia control panel, if I select 2160p YCbCr 4:2:2 and tell MadVR I have a 10-bit capable display all works well for HDR video.
I noticed by accident that if I change the Nvidia video mode to RGB 8-bit, then when I play a 60Hz HDR test clip (e.g. LG's "Chess") I get "sparkles", indicating HDMI errors. I then realised from the MadVR HUD that it was still in D3D11 Exclusive 10-bit mode, which should not be possible in RGB @60Hz using legal HDMI 2.0 modes: RGB 10-bit would need a 22.3Gbps bit rate.
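A quick back-of-envelope check of those bit rates (a Python sketch; the 4400x2250 total timing for 4K60 and the 8b/10b TMDS encoding are assumptions taken from the CTA-861/HDMI specs, not from this thread):

```python
# Rough HDMI TMDS bit-rate check for 2160p60.
# 4K60 uses a total timing of 4400 x 2250 pixels (CTA-861), i.e. a 594 MHz
# pixel clock. TMDS sends 10 bits per channel per clock (8b/10b encoding),
# and the TMDS clock scales with bit depth (x1.25 for 10-bit RGB).
PIXEL_CLOCK_4K60 = 4400 * 2250 * 60  # 594,000,000 Hz
TMDS_CHANNELS = 3
BITS_PER_TMDS_CHAR = 10

def tmds_rate_gbps(pixel_clock_hz, bits_per_component):
    tmds_clock = pixel_clock_hz * bits_per_component / 8
    return tmds_clock * TMDS_CHANNELS * BITS_PER_TMDS_CHAR / 1e9

print(tmds_rate_gbps(PIXEL_CLOCK_4K60, 8))   # 17.82  -> legal (18 Gbps max)
print(tmds_rate_gbps(PIXEL_CLOCK_4K60, 10))  # 22.275 -> over the HDMI 2.0 limit
```

Which is where the 22.3Gbps figure comes from: 8-bit RGB just fits inside the 18Gbps link, 10-bit RGB cannot.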
Unfortunately the LG E6 gives no useful information on the signal it is receiving, so I tried to diagnose this by hooking my HTPC up to my Oppo 203's HDMI input, as the Oppo can give comprehensive info on the signals it is both receiving and outputting. However, when I did this I noticed that MadVR would then say it was outputting D3D11 Exclusive 8-bit. It seems the Oppo was somehow forcing MadVR/the Nvidia driver to use only legitimate HDMI 2.0 modes.
Maybe this is not really a problem: it seems the E6 is insufficiently prescriptive about the signals it can receive, and MadVR is merely doing what it is told it can. Still, it seems odd that either MadVR or Nvidia would try to use a mode not allowed by HDMI 2.0.
I'm anything but an expert on EDID, but I did put the EDID raw data from MadVR into http://www.edidreader.com/ and do "a bit of reading online(!)", and I cannot see how the source device knows the max bit depth for any display mode other than by "just knowing" the HDMI 2.0 spec, in which case it should know not to allow 4K/60 RGB 10-bit. I can see where it says it supports a max TMDS clock of 600MHz, that it supports 2160p/60 and that, generally, untied to any specific VIC, it supports up to 12-bit. I don't see how it excludes bit depths @50/60Hz higher than HDMI 2.0 allows.
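For what it's worth, that 600MHz figure is carried as a single byte in the HDMI vendor-specific data block of the CTA extension, and decoding it is trivial (a sketch; the byte value 120 is an assumed example for an 18Gbps-class display, not taken from this particular EDID dump):

```python
def max_tmds_mhz(vsdb_byte):
    # HDMI VSDB "Max TMDS rate" field: supported rate in MHz is the
    # byte value multiplied by 5 (a value of 0 means "not indicated").
    return vsdb_byte * 5

print(max_tmds_mhz(120))  # 600 -> caps the link at 18Gbps-class modes
```

So the EDID only advertises a maximum TMDS clock; it is up to the source to know which resolution/depth/chroma combinations fit under that cap.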
Anyone have any insights or thoughts (or interest!) on this somewhat nerdy question? :) It seems, in an ideal world, MC would not try to output video modes that HDMI 2.0 does not support, even when the user configures his PC in a less-than-ideal way.
-
Just because madVR renders a 10-bit image does not mean that this is what is going to go over the cable. In fact you can always use 10-bit rendering mode, whether or not any of your hardware actually supports it. When it's not supported, the driver will simply reduce the bit depth of the image.
It's really quite similar to using YCbCr 4:2:2 output. madVR still renders RGB, which is 4:4:4 by definition. So something has to change the image: the graphics driver.
No software gets close to being able to control what actually goes out of the HDMI port; the driver is fully in charge of that.
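To illustrate what "reduce the bit depth" means in practice (a toy sketch only, not actual driver code; real drivers typically apply dithering rather than plain truncation):

```python
import random

def to_8bit(v10, dither=True):
    # Collapse a 10-bit value (0-1023) to 8-bit (0-255) by dropping the two
    # least-significant bits. Optional random dither trades visible banding
    # for noise, roughly what a GPU driver's dithering stage does.
    if dither:
        v10 = min(1023, v10 + random.randint(0, 3))
    return v10 >> 2

print(to_8bit(1023, dither=False))  # 255
print(to_8bit(512, dither=False))   # 128
```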
-
Thanks @Hendrik,
It's interesting though, isn't it, that when the Oppo is in the chain MadVR does switch to 8-bit mode, whereas without it MadVR uses 10-bit mode and there are visible HDMI errors that don't occur with 4K/60 YCbCr 4:2:2 12-bit or RGB 8-bit (both 18Gbps). It does "seem" like an illegal bit rate is being used. Sounds like, if it is, it's the driver's fault. But any thoughts on why MadVR behaves differently with/without the Oppo?
-
madVR in-depth questions should really be asked over at Doom9 so madshi can answer them.
-
Yeah, that sounds fair. I just didn't want to ask there and find that MC's display switching mechanism was in play.
-
By “sparkles” do you mean random white dots on screen? If so, then your HDMI cable does not have enough bandwidth to do 2160p in RGB (or 4:4:4) at 60Hz. Problems range from sparkles to a completely blank screen.
I went through the same problems on my C6 and E6. Finally, after several so-called HDMI 2.0 cables, I got a certified one and the problem went away.
Why would you want to run RGB (or 4:4:4) instead of the 4:2:0 that Blu-ray Discs use or the 4:2:2 that Blu-ray players output? Because you want MadVR to do the chroma upsampling instead of the TV. That’s the whole point of MadVR.
If you output from the PC to the E6 in 4:2:2, MadVR will do its scaling in 4:4:4 ... then downsample to 4:2:2 to match your output setting ... then your TV will upsample to 4:4:4 again.
That being said ... I run my desktop at 60Hz for the screen to be more responsive (in 1080p for 3D to work automatically ... that’s another story) ... and let MC switch the refresh rate to match the content (I also let MC switch to 2160p for 2D content ... but as said before, completely different topic) ... but all in 4:4:4.
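That round trip can be shown with a toy one-dimensional example (a sketch only; real 4:2:2 chains use proper filters rather than nearest-neighbour, but the loss of fine horizontal chroma detail is the same idea):

```python
def downsample_422(chroma_row):
    # 4:4:4 -> 4:2:2: keep one chroma sample per horizontal pixel pair
    return chroma_row[::2]

def upsample_422(chroma_row):
    # 4:2:2 -> 4:4:4: duplicate each sample back to full width
    return [c for c in chroma_row for _ in (0, 1)]

row = [10, 200, 10, 200]  # fine alternating chroma detail
print(upsample_422(downsample_422(row)))  # [10, 10, 10, 10] -> detail is gone
```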
-
Yeah, I get that the cable is not coping.
The point is that it does cope at 4K/60 with RGB 8-bit and with YCbCr 4:2:2 12-bit, i.e. 18Gbps. The problem is that with 60fps HDR10 content the driver seems to be outputting an illegal 4K/60 RGB 10-bit signal that would require 22.3Gbps, more than any HDMI 2.0 cable or equipment is spec'd for.
We really do need to use YCbCr 4:2:2 if we want to do HDR10 at all refresh rates, because there is no HDR10-capable 4K/60 RGB video mode in HDMI 2.0. I know it involves more YCbCr->RGB->YCbCr conversions, but sadly that's unavoidable at the moment. It was only by accident that I happened to test with RGB. Now that I understand it, it looks like a driver bug and nothing to do with MadVR or MC :).
-
For the record, on most TVs you can also do HDR with RGB 8-bit, which should still have decent quality.
-
Yes, thanks. For the limited amount of 50/60Hz HDR material I could probably live with that, and I guess that's what the driver should be sending for 4K/60 RGB. But I wouldn't want to use it unnecessarily for all HDR, and it's not workable if the driver outputs 10-bit RGB for both 24fps AND 50/60Hz.
I'll try a full driver reinstall, and if that doesn't fix it, hopefully a future driver update will.
-
Hmmm ... I have never come across 2160p at 60fps material.
The NVIDIA control panel does not allow switching to RGB 10-bit at 2160p60/50 on my 1070 connected to the E6 ... so it's surprising that MadVR/MC is supposedly able to do so.
I suppose you could create a profile in MadVR to run 2160p50/60 in D3D9, as that only outputs 8-bit.
Or a profile based on the E6's properties that says it supports only 8-bit for 2160p50/60 content.
-
You are right to say there is precious little "real" (i.e. not demo) 2160p/60 or 2160p/50 (same HDMI bit rate) HDR material right now. For most HTPC users 24p/25p is by far the most important. But in future most live broadcasts, such as sport and concerts, will need it (once HDR takes off, anyway!).
Yeah, what I am seeing shouldn't be possible; or rather, technically it is, but it's outside of HDMI 2.0.
I've concluded it's a driver bug: it should reduce what it gets from MadVR down to 8-bit to stay in spec, and it's not.
It's easy to stop MadVR doing 10-bit; in fact it does 8-bit by default. But if you want HDR10 for 24p, and you don't want to reconfigure just to play a different frame rate, it can get a bit messy! Anyway, it's pretty clear now that this is not a problem with MC or MadVR, so I'll just work around it until it's fixed.
-
Not sure about the US ... but in Thailand current HD stuff from cable or satellite is 720p or 1080i ... so I doubt they will do 2160p50/60.
... the only potential problem for me is upscaling DVD to 4K and running it at 50/60Hz ... I think at the moment I run a profile in madVR to force it into 8-bit at 2160p50/60.
-
Several things:
- If you move the mouse to the top of the screen during playback and the JRiver overlay comes up, madVR will always drop into 8-bit output mode.
- With Windows 10, you don't need to use exclusive mode for 10-bit output. I prefer windowed mode so you aren't dropping in and out of exclusive mode. I haven't noticed any improvement to queues or quality when using exclusive mode.
- Some displays require YCbCr 4:2:2, 2160p, 10-bit or higher to trigger HDR mode. When using madVR it is better to use the option "convert HDR content to SDR by using pixel shader math". This does a frame-by-frame analysis of the image, allows one to preserve the wide color gamut, and gives one nice control over the tone-mapping curve. Another benefit is that you can still use 2160p, RGB, 10-bit output to the display to prevent compression by the graphics driver.
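For context on what a tone-mapping curve does, here is a sketch of one common shape (an extended-Reinhard curve, used here purely as an illustration; it is not madVR's actual algorithm, and the 1000-nit peak is an assumed example):

```python
def tonemap(l_in, peak=10.0):
    # l_in: scene luminance relative to SDR reference white (1.0 = 100 nits).
    # peak: HDR peak in the same units (10.0 = 1000 nits, an assumed example).
    # Extended Reinhard: near-linear for dark content, compresses highlights
    # so that l_in == peak maps exactly to 1.0 (SDR white).
    pk2 = peak * peak
    return l_in * (pk2 + l_in) / (pk2 * (1.0 + l_in))

print(tonemap(10.0))  # 1.0 -> the HDR peak lands exactly at SDR white
```

Shadows pass through almost unchanged while highlights are progressively squeezed, which is why a user-adjustable curve is a useful control.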