
NEW: JRVR -- JRiver Video Renderer


davelr:

--- Quote from: Hendrik on November 29, 2021, 10:12:11 am ---More bits is more better. :)
10-bit carries more detail than the default 8-bit. But it's only meaningful if your display can actually receive and properly process 10-bit signals, and it can be detrimental if turned on when your display is not capable of this, hence it being an advanced option.

--- End quote ---

Thanks Hendrik, I get the bit depth issue. My ignorance is around HDR, as I'm just starting to play with it. I guess my question is: what is the setting actually meant to do? Is it supposed to turn on OS HDR for an SDR source? When I've set it, I can't tell that anything changes. I'm running Radeon Vega 11 graphics set at 10-bit through a Denon 4400 to an LG OLED B7. When I play a sample HDR file, OS HDR gets turned on, but I can't see anything happening with an SDR source. Please forgive me if this is a stupid question.

Hendrik:
No, it's not supposed to be noticeable. To make use of it, you would have to have your graphics card set to output 10-bit (or higher), and when 10-bit output is on and you go fullscreen, it would automatically be used without any notification.
You might be able to see it by comparing 10-bit gradient images, but at 8-bit we use high-quality dithering, so it might only show up as a slight reduction in noise.

The reason it's an option is that we don't know if your graphics card is set to output 10-bit, or if your screen will properly accept and process it. Until we can figure that out (which might be never), it has to be an option.
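
As a rough illustration of what that dithering buys you: the sketch below quantizes a 10-bit code that falls halfway between two 8-bit codes, once by plain rounding and once with about one LSB of random dither. It's a generic toy example in Python, not JRVR's actual dithering algorithm, and the values are made up purely for illustration.

--- Code: ---
import numpy as np

# Toy example: a 10-bit level that sits between two 8-bit codes.
# 514/1023 corresponds to roughly 128.5/255 in 8-bit terms.
value_10bit = 514
target_8bit = value_10bit / 4.0          # 128.5 -- not representable in 8-bit

# Plain rounding: every pixel gets the same 8-bit code, so the extra
# half-step of detail is lost (this is what shows up as banding in gradients).
plain = int(np.round(target_8bit))       # 128 for every pixel

# Random dither: add about one LSB of noise before rounding. Individual
# pixels flip between 128 and 129, but their average preserves the 10-bit level.
rng = np.random.default_rng(0)
dithered = np.round(target_8bit + rng.uniform(-0.5, 0.5, 100_000))

print("plain rounding  :", plain)               # 128
print("dithered average:", dithered.mean())     # ~128.5 -- detail kept as low-level noise
--- End code ---

With a good dither the 10-bit information is still there on average; compared to true 10-bit output the only visible difference is a small amount of extra noise, which matches Hendrik's description above.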

davelr:

--- Quote from: Hendrik on November 29, 2021, 12:44:54 pm ---...

The reason it's an option is that we don't know if your graphics card is set to output 10-bit, or if your screen will properly accept and process it. Until we can figure that out (which might be never), it has to be an option.

--- End quote ---

Got it, thanks for clearing it up for me.

Ashfall:

--- Quote from: Hendrik on November 29, 2021, 12:44:54 pm ---No, it's not supposed to be noticeable. To make use of it, you would have to have your graphics card set to output 10-bit (or higher), and when 10-bit output is on and you go fullscreen, it would automatically be used without any notification.
You might be able to see it by comparing 10-bit gradient images, but at 8-bit we use high-quality dithering, so it might only show up as a slight reduction in noise.

The reason it's an option is that we don't know if your graphics card is set to output 10-bit, or if your screen will properly accept and process it. Until we can figure that out (which might be never), it has to be an option.

--- End quote ---

For those of us with an HDMI 2.0 graphics card whose drivers are set to output 10-bit or 12-bit to the display, this is limited to lower frame rates: at 50/60 fps the graphics card outputs 8-bit, while at 24/25 fps the video driver switches back to 10-bit or 12-bit. I have MadVR configured with profile rules so that the fps determines whether it outputs 8-bit or 10-bit to the video driver, to match the HDMI 2.0 constraints. This way I get 10-bit from MadVR all the way to the display for SDR 24/25 fps sources. Will JRVR have something similar?
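
For anyone wondering where that limit comes from, the arithmetic roughly works out as below. This is a back-of-the-envelope sketch assuming standard CTA-861 4K pixel clocks, RGB/4:4:4 output, and roughly 14.4 Gbit/s of usable HDMI 2.0 TMDS bandwidth after 8b/10b encoding; chroma subsampling, which changes the numbers, is ignored.

--- Code: ---
# Why HDMI 2.0 can carry 4K 10/12-bit RGB at 24/25 fps but not at 50/60 fps.
# Assumes standard CTA-861 timings and RGB / 4:4:4 output.
HDMI20_USABLE_GBPS = 14.4          # 18 Gbit/s raw minus 8b/10b TMDS overhead

# Pixel clocks for common 4K modes (total raster including blanking, in MHz)
modes = {
    "2160p24/25/30": 297.0,
    "2160p50/60":    594.0,
}

for name, pixel_clock_mhz in modes.items():
    for bits_per_component in (8, 10, 12):
        gbps = pixel_clock_mhz * 1e6 * 3 * bits_per_component / 1e9
        verdict = "fits" if gbps <= HDMI20_USABLE_GBPS else "too much"
        print(f"{name} @ {bits_per_component}-bit: {gbps:5.2f} Gbit/s -> {verdict}")
--- End code ---

At a 594 MHz pixel clock (2160p50/60) only 8-bit stays under the limit, while at 297 MHz (2160p24/25/30) both 10-bit and 12-bit fit, which is exactly the driver behaviour described above.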

JimH:
Nice summary of power usage by syndromeofdawn:
https://yabb.jriver.com/interact/index.php/topic,130954.msg910945.html#msg910945

Thanks!
