INTERACT FORUM


Author Topic: Auto SDR / HDR switching on Intel integrated graphics and MadVR development end

Mark_Chat

  • Recent member
  • *
  • Posts: 43

I thought about posting this in the hardware section, but, really, this is purely to do with Media Center programming itself.

Basically, JRiver needs a solution that passes full-screen HDR content straight through to the integrated Intel graphics and switches the UHD 630 (2017 and later) chipset into 10/12-bit HDR mode automatically.
(I am not referring to switching Windows into the awful permanent HDR mode, which makes it unable to pass through SDR content properly and destroys normal Windows and SDR source colour saturation.)

The HTPC desktop needs to be left running in 8-bit SDR mode for normal PC use and SDR content, but HDR sources should then be output as 10/12-bit HDR video.
MadVR does this with NVIDIA but not with Intel graphics. Scouring these forums and the internet, Intel and Windows have been blamed for not providing a solution, but this is NOT the case, as Kodi does it with Windows running in SDR mode and with the regular Intel graphics drivers.
https://forum.kodi.tv/showthread.php?tid=349861
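
For illustration, here is a rough, untested sketch (not something JRiver or Kodi necessarily uses, and the function name is just for this example) of how a Windows app can ask DXGI whether the active output is currently presenting in an HDR10 colour space - the kind of check a player needs before deciding how to output:

// Untested sketch: queries the first output on the first adapter and reports
// whether it is currently using the ST.2084 (PQ) HDR10 colour space.
// Requires Windows 10 1703+ and dxgi1_6.h; link against dxgi.lib.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

bool OutputIsHdr10()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return false;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return false;  // first GPU, e.g. the Intel iGPU

    ComPtr<IDXGIOutput> output;
    if (FAILED(adapter->EnumOutputs(0, &output))) return false;     // first attached display

    ComPtr<IDXGIOutput6> output6;
    if (FAILED(output.As(&output6))) return false;

    DXGI_OUTPUT_DESC1 desc{};
    if (FAILED(output6->GetDesc1(&desc))) return false;

    // DXGI reports this colour space when the output is in HDR (advanced colour) mode.
    return desc.ColorSpace == DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P709_P2020;
}

int main()
{
    std::printf("HDR10 output: %s\n", OutputIsHdr10() ? "yes" : "no");
}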

I suspect Madshi has stopped developing MadVR as there is no profit in it for him, and is focusing instead on hardware solutions (the MadVR Envy sells for $7,000 a pop!).
The last MadVR release was the "current" 0.92.17, which dates from September 2018, so I am pretty sure this is close to the truth.

Problem: MadVR does not integrate with the Intel integrated graphics driver to switch the UHD 630 graphics into its full-screen HDR mode.
JRiver relies on MadVR and so cannot do this either (and likely never will unless MadVR development restarts).

Intel graphics users (this includes the JRiver Id, which uses an Intel NUC) need an alternative to MadVR built into Media Center - i.e. Red October Standard needs a rewrite / reimplementation.

This issue means that, for a while now, I have been manually switching Windows to HDR mode for HDR sources and MKVs, as this is the only way to do it effectively.
(I don't want MadVR tone-mapping HDR content into an 8-bit signal, because my TV then doesn't switch to its HDR mode and ramp up the brightness and local dimming, which means I am not getting HDR at all and certainly not the 1,000 nits my QLED can deliver.)
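
For anyone else who ends up toggling Windows by hand, the switch can at least be scripted. Below is a rough, untested sketch using the documented DisplayConfig API (the same "advanced colour" state the Settings toggle flips); the helper name and command-line handling are just illustrative.

// Untested sketch: enables or disables Windows "advanced colour" (HDR) on every
// active display via the DisplayConfig API. Needs Windows 10 1709+; link user32.lib.
#include <windows.h>
#include <vector>
#include <cstdio>

bool SetHdrForAllDisplays(bool enable)
{
    UINT32 numPaths = 0, numModes = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &numPaths, &numModes) != ERROR_SUCCESS)
        return false;

    std::vector<DISPLAYCONFIG_PATH_INFO> paths(numPaths);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(numModes);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &numPaths, paths.data(),
                           &numModes, modes.data(), nullptr) != ERROR_SUCCESS)
        return false;

    bool ok = true;
    for (UINT32 i = 0; i < numPaths; ++i)
    {
        DISPLAYCONFIG_SET_ADVANCED_COLOR_STATE state = {};
        state.header.type      = DISPLAYCONFIG_DEVICE_INFO_SET_ADVANCED_COLOR_STATE;
        state.header.size      = sizeof(state);
        state.header.adapterId = paths[i].targetInfo.adapterId;
        state.header.id        = paths[i].targetInfo.id;
        state.enableAdvancedColor = enable ? 1 : 0;

        if (DisplayConfigSetDeviceInfo(&state.header) != ERROR_SUCCESS)
            ok = false;
    }
    return ok;
}

int main(int argc, char** argv)
{
    // e.g. "hdrtoggle 1" before an HDR film, "hdrtoggle 0" afterwards
    bool enable = (argc > 1 && argv[1][0] == '1');
    std::printf("%s HDR: %s\n", enable ? "Enabling" : "Disabling",
                SetHdrForAllDisplays(enable) ? "ok" : "failed");
}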

Basically, I need my media player to output 8-bit video for SDR content and 10/12-bit for HDR content. JRiver cannot do this with Intel graphics.
This has been possible in Media Center using the NVIDIA graphics card driver settings for a while, so prior to paying to upgrade my license to Media Center 28 I looked for other options, and Kodi popped up.

It is a JRiver Media Center issue, not a Windows or Intel failing, as I confirmed that it works as expected with Kodi, which I installed this weekend to check.

With Kodi, Windows is left in default SDR mode and PC use and SDR playback stay at the normal 8 bit.
However, play an HDR file and automatic passthrough of full-screen 12-bit HDR via Intel graphics to my Samsung works just fine with Kodi.
My TV switches itself into HDR mode and ramps up its brightness every time I play an HDR source. (It is not just a case of the TV OSD showing an HDR badge when it receives an HDR signal!)

I appreciate that some people rave about MadVR tone mapping and customisation, perceiving the colours to be better than what their screen achieves from an HDR-coded signal, but for me my Samsung (possibly with its more recent software than the 2018 MadVR!) does a great job with HDR, and is presumably tuned by the manufacturer to what my panel can actually achieve.

See the Kodi documentation here - HDR passthrough has been enabled since 2019.
https://forum.kodi.tv/showthread.php?tid=349861

Any chance that Red October Standard can be changed to do the same?
(Probably by using DirectX 11 or later and the current APIs built into Windows and the Intel graphics drivers.)
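
Purely as an illustration of what those APIs look like (an untested sketch, not a claim about how Kodi or MadVR actually implement it), a player with a DirectX 11 flip-model swap chain and a 10-bit back buffer can request HDR10 presentation for its own window while the desktop stays SDR:

// Untested sketch: asks the swap chain to present its 10-bit back buffer as
// ST.2084 (PQ) / BT.2020, i.e. HDR10. Assumes an existing D3D11 device and a
// flip-model swap chain created with DXGI_FORMAT_R10G10B10A2_UNORM.
#include <dxgi1_6.h>

bool EnableHdr10(IDXGISwapChain3* swapChain)
{
    const DXGI_COLOR_SPACE_TYPE hdr10 = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P709_P2020;

    UINT support = 0;
    if (FAILED(swapChain->CheckColorSpaceSupport(hdr10, &support)) ||
        !(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
        return false;  // this driver/display combination cannot present HDR10 here

    // IDXGISwapChain4::SetHDRMetaData can additionally pass HDR10 static metadata
    // (mastering display, MaxCLL/MaxFALL) for the display to use.
    return SUCCEEDED(swapChain->SetColorSpace1(hdr10));
}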
Logged

Hendrik

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 10710

Unfortunately, we don't have any immediate plans in that area to announce. It's a complicated topic and not something we can just change in a few days.
Long-term, we're evaluating how we can create a more independent video pipeline, but there is no time frame on any changes.
Logged
~ nevcairiel
~ Author of LAV Filters

Manfred

  • Citizen of the Universe
  • *****
  • Posts: 1023

I would agree that madVR is an issue. It also does not support the macOS platform. The new Apple silicon chips are very powerful, as are their internal GPUs, but MC does not make use of them. MC's video engine on the Mac is the reason for me to stay with Windows and NVIDIA. If there were a better video engine on the Mac platform I would switch. The new Mac mini M1 is a great device.
Logged
WS (AMD Ryzen 7 5700G, 32 GB DDR4-3200, 2x2 TB SDD, LG 34UC98-W)-USB|ADI-2 DAC FS|Canton AM5 - File Server (i3-3.9 GHz, 16GB ECC DDR4-2400, 46 TB disk space) - Media Renderer (i3-3.8 GHz, 8GB DDR4-2133, GTX 960)-USB|Devialet D220 Pro|Audeze LCD 2|B&W 804S|LG 4K OLED )

Mark_Chat

  • Recent member
  • *
  • Posts: 43

Dear Hendrik,
That was very helpful, and I can understand that writing a new non-proprietary video pipeline would be a huge undertaking.

If MadVR development doesn't move beyond the current 2018 iteration then a lot of video players will face the same dilemma at some point.
As Manfred posted, integrated graphics capabilities are plenty fast enough now for 4K HDR, so until 8K sources become a thing, gaming graphics cards are simply not needed as long as the integrated graphics can be leveraged.
I only have a 35 W Intel 9600T with Intel UHD 630 graphics in a fanless ITX case, so no overclocking, and the CPU and iGPU are plenty fast enough to handle everything so far, including Red October HQ.

Many thanks for the personal and quick response.
Logged

tij

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1557

https://yabb.jriver.com/interact/index.php?topic=116703.0

NUCs are OK for non-critical viewing ... an NVidia Shield is probably cheaper

But there is a reason why people put a 1080 Ti in their HTPC ... it's the same reason people buy a Lumagen ... the quality is on another level.

I agree that it does not look good on the MadVR front. Understandably so, with the developer trying to find a balance so that MadVR does not hurt its commercial counterpart. I have to admit that I won't buy an Envy if there is an HTPC version (even if that version is paid), at least until I am too old to bother with setting it up ... and even then the Lumagen seems more attractive price-wise.

Kodi moved away from madVR a long time ago ... and Plex was never on it. MadVR is not a necessary component of video players ... it's just the one with the highest quality, even though it has not released a stable version for years.
Logged
HTPC: Win11 Pro, MC: latest 31(64b), NV Driver: v425.31, CPU: i9-12900K, 32GB RAM, GeForce: 2080ti
Screen: LG 2016 E6
NAS: FreeNAS 11.1, SuperMicro SSG-5048R-E1CR36L, E5-1620v4, 64GB ECC RAM, 18xUltrastar He12-SAS3 drives, 2x240GB SSD (OS)

Manfred

  • Citizen of the Universe
  • *****
  • Posts: 1023

An additional MC option for video output on a TV could be to let the TV do the upscaling etc. Modern TVs like the Sony A90J have video processors whose quality is enough for most people (I'd guess >80%).

For PC monitors it's a different discussion.
Logged
WS (AMD Ryzen 7 5700G, 32 GB DDR4-3200, 2x2 TB SDD, LG 34UC98-W)-USB|ADI-2 DAC FS|Canton AM5 - File Server (i3-3.9 GHz, 16GB ECC DDR4-2400, 46 TB disk space) - Media Renderer (i3-3.8 GHz, 8GB DDR4-2133, GTX 960)-USB|Devialet D220 Pro|Audeze LCD 2|B&W 804S|LG 4K OLED )

Mark_Chat

  • Recent member
  • *
  • Posts: 43

Quote from: tij
    https://yabb.jriver.com/interact/index.php?topic=116703.0
    NUCs are OK for non-critical viewing ... an NVidia Shield is probably cheaper

This is plain wrong.
I'm not sure what non-critical viewing is, but it probably isn't relevant to the solution anyway.
(For me, I think it would apply to any video where I don't notice any shortcomings while watching, even if I focus on the quality of the presentation rather than the content.)
These days it is the source that limits the quality on almost all modern hardware. No amount of graphics processing is going to make Zack Snyder's Justice League "pop".

The integrated graphics in Intel's modern processors can do more than enough for a 4K HTPC.
Your posting of a link to a three-year-old NUC review shows a little naivety here - it refers to a CPU five generations old, with an iGPU delivering only 20-25% of the GFLOPS of the latest solutions.
Logged

tij

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1557

Non-critical viewing ... for example, I am cooking while watching TV. Critical viewing ... I sit with popcorn, with all the lights dimmed down, and enjoy the movie.

And it's not plain wrong ... it's the truth.

If you want the absolute best ... then an iGPU won't cut it ... not even close.

There is a reason why MC included the power-hungry MadVR ... my 1070 is barely keeping up with 4K@60 UHD.

And the source was always the limiting factor ... not just these days.

PS. I can give you reasons why you need computational power for video processing ... especially 4K, as the number of pixels quadruples compared to HD ... but I have a feeling you won't believe me.
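(For reference, UHD is 3840 x 2160 = 8,294,400 pixels per frame against 1920 x 1080 = 2,073,600 for HD ... exactly four times the pixels, so roughly four times the processing work per frame.)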

Seeing is believing ... the easiest thing is to go to Lumagen/MadVR dealers for a demo.

If you have a powerful GPU and time to play with MadVR ... load the latest beta and try it ... see if it makes Justice League "pop".

Other than that ... it's just your word against mine ... pointless ...
Logged
HTPC: Win11 Pro, MC: latest 31(64b), NV Driver: v425.31, CPU: i9-12900K, 32GB RAM, GeForce: 2080ti
Screen: LG 2016 E6
NAS: FreeNAS 11.1, SuperMicro SSG-5048R-E1CR36L, E5-1620v4, 64GB ECC RAM, 18xUltrastar He12-SAS3 drives, 2x240GB SSD (OS)