INTERACT FORUM


Author Topic: FR: Option in JRVR to use the TV as upscaler and to use Server for DSD->PCM

Manfred

  • Citizen of the Universe
  • Posts: 1023

JRVR has HDR passthrough - why not have an option to let the TV also do the upscaling (new Sony and LG TVs are very good in this area)? If it were also possible to let the server in a client-server setup do the DSD to PCM conversion, one would have a very thin client.
Logged
WS (AMD Ryzen 7 5700G, 32 GB DDR4-3200, 2x2 TB SSD, LG 34UC98-W)-USB|ADI-2 DAC FS|Canton AM5 - File Server (i3-3.9 GHz, 16GB ECC DDR4-2400, 46 TB disk space) - Media Renderer (i3-3.8 GHz, 8GB DDR4-2133, GTX 960)-USB|Devialet D220 Pro|Audeze LCD 2|B&W 804S|LG 4K OLED

voodoo5_6k

  • World Citizen
  • Posts: 184

Quote from: Manfred
JRVR has HDR passthrough - why not have an option to let the TV also do the upscaling (new Sony and LG TVs are very good in this area)? If it were also possible to let the server in a client-server setup do the DSD to PCM conversion, one would have a very thin client.
So, you're basically asking for what I called EVPM in my feature request: a single option to force MC to play back at the content's resolution in order to let an external video processor (a Radiance Pro in my example, the TV in yours) handle the upscaling after the initial chroma upscaling. It is also one of my intentions to have a very thin client. So, I'd support this request, and I guess you'd support mine ;)

https://yabb.jriver.com/interact/index.php/topic,131919.0.html
Logged
END OF LINE.

jmone

  • Administrator
  • Citizen of the Universe
  • Posts: 14267
  • I won! I won!

Thinking out loud, I'm not sure this is a JRVR feature but rather an MC player feature.

We already have a "Display Rate Changer" that lets MC instruct the GPU driver to switch the refresh rate to match that of the content. It sounds like you are after a "Display Resolution Changer" that does a similar thing, but based on the media's resolution. It would be better placed in the player for a couple of reasons (a rough sketch of the idea follows after this list):
- It would work with any renderer (JRVR, EVR, madVR etc.) without the need for any changes
- It would change the resolution PRIOR to commencing playback (renderers don't like it when the resolution changes during playback)
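
Just to sketch what I mean (a rough illustration of the general approach on Windows only, not how MC implements anything - the function name is made up): before playback starts, the player could ask the driver to switch the display mode to the content's resolution via the standard Win32 display APIs, and the change would be reverted afterwards.

Code: [Select]
// Rough sketch only (assumed approach, not MC's implementation):
// switch the display to the content's resolution before playback starts,
// using the standard Win32 ChangeDisplaySettingsEx API.
#include <windows.h>

bool SwitchToContentResolution(DWORD width, DWORD height)
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    // Start from the current mode so refresh rate and bit depth are preserved.
    if (!EnumDisplaySettings(nullptr, ENUM_CURRENT_SETTINGS, &dm))
        return false;

    dm.dmPelsWidth  = width;   // e.g. 1920 for 1080p content
    dm.dmPelsHeight = height;  // e.g. 1080
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

    // CDS_FULLSCREEN makes the change temporary (restored when the app exits).
    return ChangeDisplaySettingsEx(nullptr, &dm, nullptr,
                                   CDS_FULLSCREEN, nullptr) == DISP_CHANGE_SUCCESSFUL;
}

Because this happens at the player/OS level rather than in the renderer, it would sit in front of JRVR, EVR or madVR alike.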

There are a few things to consider:
- It would only match the display resolution to the content's luma resolution. As Hendrik has explained, on PCs the chroma will always be upscaled to match the luma resolution.
- What do you do for content where there is no corresponding resolution profile in the GPU (e.g. I have a bunch of 1440x1080 non-square-pixel videos)? These must be scaled anyway, so do you just scale to a "default" profile, the closest match, etc.? (a rough sketch of a closest-match policy follows after this list)
- Given you are still going to have to scale chroma and also any mismatched luma... you will still need a PC that can handle it.
- Will it really make a visible quality difference?
- How many people would use such a feature?
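
On the "closest" option, one possible policy (just an assumption for illustration, not an existing MC feature - the function name is made up) would be to enumerate the modes the driver exposes and pick the smallest one that still fits the content, falling back to the native resolution if nothing fits:

Code: [Select]
// Sketch of a "closest mode" fallback policy (illustration only):
// pick the smallest display mode that is at least as large as the content,
// e.g. 1440x1080 content would typically land on a 1920x1080 mode.
#include <windows.h>

bool FindClosestMode(DWORD contentW, DWORD contentH, DEVMODE& best)
{
    bool found = false;
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);

    for (DWORD i = 0; EnumDisplaySettings(nullptr, i, &dm); ++i)
    {
        if (dm.dmPelsWidth < contentW || dm.dmPelsHeight < contentH)
            continue;  // too small - the content would have to be downscaled
        if (!found ||
            dm.dmPelsWidth * dm.dmPelsHeight < best.dmPelsWidth * best.dmPelsHeight)
        {
            best  = dm;
            found = true;
        }
    }
    return found;  // caller falls back to the native resolution if nothing fits
}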


Logged
JRiver CEO Elect

voodoo5_6k

  • World Citizen
  • Posts: 184

Quote from: jmone
Thinking out loud, I'm not sure this is a JRVR feature but rather an MC player feature.

We already have a "Display Rate Changer" that lets MC instruct the GPU driver to switch the refresh rate to match that of the content. It sounds like you are after a "Display Resolution Changer" that does a similar thing, but based on the media's resolution. It would be better placed in the player for a couple of reasons:
- It would work with any renderer (JRVR, EVR, madVR etc.) without the need for any changes
- It would change the resolution PRIOR to commencing playback (renderers don't like it when the resolution changes during playback)
Hey jmone, thanks for your feedback :)

I'd also guess it is more of an MC feature rather than something specific to a renderer. But to me it is less important where it gets implemented than that it gets implemented ;) (That's why I also offered to support the implementation of this feature financially.)

There are a few things to consider:
- It would only match the display resolution to the content's luma resolution. As Hendrik has explained, on PCs the chroma will always be upscaled to match the luma resolution.

--> That would have to be accepted by us requestors. Therefore, the client should be capable enough to perform very good chroma upscaling (but only up to the content's resolution, not the display's). A rough sketch of what that operation involves is below.
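
Just to illustrate the kind of work that would remain on the client (a deliberately simplified nearest-neighbour sketch of 4:2:0 to 4:4:4 chroma upsampling - JRVR of course uses far better filters, and the function name is made up):

Code: [Select]
// Simplified sketch of 4:2:0 -> 4:4:4 chroma upsampling (nearest neighbour).
// Each chroma plane is a quarter of the luma plane and only has to be brought
// up to the content's resolution here, not the display's.
#include <cstdint>
#include <vector>

std::vector<uint8_t> UpsampleChromaPlane(const std::vector<uint8_t>& src,
                                         int srcW, int srcH)
{
    const int dstW = srcW * 2, dstH = srcH * 2;   // luma resolution
    std::vector<uint8_t> dst(static_cast<size_t>(dstW) * dstH);

    for (int y = 0; y < dstH; ++y)
        for (int x = 0; x < dstW; ++x)
            dst[static_cast<size_t>(y) * dstW + x] =
                src[static_cast<size_t>(y / 2) * srcW + x / 2];

    return dst;
}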

- What do you do for content where there is no corresponding resolution profile in the GPU (e.g. I have a bunch of 1440x1080 non-square-pixel videos)? These must be scaled anyway, so do you just scale to a "default" profile, the closest match, etc.?
--> Yeah, that's something that should be up to the user to decide/configure. In your example, I'd not scale the luma but keep the non-square pixels. The Radiance Pro has dedicated aspect ratio buttons to switch an input's aspect ratio on the fly; I'd use those to dial in the video's aspect ratio.

- Given you are still going to have to scale chroma and also any mismatched luma... you will still need a PC that can handle it.
--> Mismatched luma aside (see the point above), yes, but the PC could be a lot lighter on hardware, especially as chroma would only be upscaled to the content's resolution, not to the intended screen/display resolution. I'd get rid of the dedicated GPU completely and go back to the iGPU - that's basically a quarter or less of the original power budget. A system that small can be piggy-backed onto a TV etc. It also needs much less cooling and is way easier to keep quiet in the summer.

- Will it really make a visible quality difference?
--> That doesn't even matter that much (in my opinion). If the same quality can be reached, there are huge power savings: the Radiance Pro uses much less power than a dedicated GPU capable of driving madVR at high or highest settings. And in Manfred's example it's even better, as the TV is running anyway to display the content.

- How many people would use such a feature?
--> That is an interesting question. Initially, I'd say very few. Manfred's reason (correct me if I'm wrong) is that for his requirements the TV's scaling engine is good enough and he wants a thin client; I also guess he'd use the TV's scaling engine for all other sources too. My reason is to have a scaling engine that I consider good enough for my requirements (i.e. the Radiance Pro, although I'd test it before purchasing, of course), to be able to use it on all sources, and then to save power in the end too (because I'd downsize my HTPC a lot).

So, initially, only some customers would use it: those that have some external scaling device, want to use it on all their sources, and want all their sources to be as thin and efficient as possible. But more users are possible later on. With TVs and projectors becoming better at upscaling (and also tone mapping), there might be a point when there is no real benefit anymore in wasting a 500 W GPU on upscaling video. Then MC would have a solution ready at hand.

Also, like I said in my feature request, this feature would allow users to keep using MC (with all its benefits, like the library, Theater View etc.) while "outsourcing" the heavy scaling elsewhere. And having done that, all other sources would gain through the external upscaler. Another point is that the user would be able to use the enhanced CMS capabilities of the external upscaler, both for MC and all other sources.

So, this feature "might" be an investment in the mid-term future of MC. Having already gained its own lightweight and universal renderer, this would be another step towards MC being a lightweight universal multimedia application that people can "just connect" to their existing home theater infrastructure (i.e. just plug it into your TV, HDMI switch, EVP etc., set it to what I called "EVPM", and you're done ;)). Initially, it could of course also be a "paid upgrade", like a "plus" version of MC, for those folks interested. But that's a different story.
Logged
END OF LINE.