INTERACT FORUM


Author Topic: 8bit or 10bit, what is the truth?  (Read 599 times)

murray

  • Citizen of the Universe
  • *****
  • Posts: 544
8bit or 10bit, what is the truth?
« on: September 19, 2023, 07:11:26 pm »

I can't tick 10-bit on my 3080 video card as only 8-bit shows, however the playback stats show that I have 10-bit on the output. I have 10-bit ticked in JRVR. Are the stats telling me the truth that I really am outputting 10-bit, or is it really not the truth?

Display JVC NZ9, 8K/ 60p/ 4K120p.


davewave

  • Regular Member
  • Junior Woodchuck
  • **
  • Posts: 55
Re: 8bit or 10bit, what is the truth?
« Reply #1 on: September 19, 2023, 08:48:17 pm »

Just out of curiosity, what is your monitor?   As I understand it, for example, most Sony TVs (if not all) are only 8 bit.   Sony tries to obfuscate the issue when asked.  If your TV can't handle 10 bit, every other indicator could be misleading.

murray

  • Citizen of the Universe
  • *****
  • Posts: 544
Re: 8bit or 10bit, what is the truth?
« Reply #2 on: September 19, 2023, 08:52:25 pm »

Quote from: davewave
Just out of curiosity, what is your monitor? As I understand it, for example, most Sony TVs (if not all) are only 8 bit. Sony tries to obfuscate the issue when asked. If your TV can't handle 10 bit, every other indicator could be misleading.
I have the new JVC NZ9, 8K/ 60p/ 4K120p.

Hendrik

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 10724
Re: 8bit or 10bit, what is the truth?
« Reply #3 on: September 20, 2023, 02:35:05 am »

JRVR is outputting 10-bit to the graphics driver. Whether the driver/GPU then reduces the image to 8-bit because the display connection is only 8-bit is opaque to us. If your display connection is only 8-bit, I would recommend not enabling 10-bit in JRVR.
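
Purely as an illustration (this is not JRVR's or the driver's actual algorithm), a reduction from 10-bit to 8-bit can look something like this, with and without dithering:

```python
import numpy as np

rng = np.random.default_rng(0)

# A fake 10-bit grayscale ramp (values 0..1023), standing in for one video plane.
frame_10bit = np.tile(np.arange(1024, dtype=np.uint16), (8, 1))

# Naive reduction: drop the two least significant bits -> visible banding on real content.
truncated_8bit = (frame_10bit >> 2).astype(np.uint8)

# Dithered reduction: add sub-LSB noise before rounding, trading banding for fine noise.
noise = rng.random(frame_10bit.shape)  # 0..1, i.e. about +/- half an 8-bit step
dithered_8bit = np.clip((frame_10bit / 4.0 + noise - 0.5).round(), 0, 255).astype(np.uint8)

print(truncated_8bit[0, :8], dithered_8bit[0, :8])
```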

If you can't use a 10-bit mode with that projector though, something in the chain is not supporting HDMI 2.1. With that projector's spec, I would assume the projector does, so perhaps it's your AVR, some other HDMI device, or even a cable? A 3080 is new enough to support it.
~ nevcairiel
~ Author of LAV Filters

Manni

  • Galactic Citizen
  • ****
  • Posts: 340
Re: 8bit or 10bit, what is the truth?
« Reply #4 on: September 27, 2023, 11:30:45 am »

Quote from: Hendrik
JRVR is outputting 10-bit to the graphics driver. Whether the driver/GPU then reduces the image to 8-bit because the display connection is only 8-bit is opaque to us. If your display connection is only 8-bit, I would recommend not enabling 10-bit in JRVR.

If you can't use a 10-bit mode with that projector though, something in the chain is not supporting HDMI 2.1. With that projector's spec, I would assume the projector does, so perhaps it's your AVR, some other HDMI device, or even a cable? A 3080 is new enough to support it.

The JVC NZs definitely support full HDMI 2.1 at 48Gb/s (I have an NZ8), so they can do 8K60 in 10 bits, but only if the EDID is set to EDID A in the PJ and, as you say, if the cables and all devices in between (switch, AVR) are up to scratch (not many are; for example, many switches and AVRs only support 40Gb/s, so they wouldn't be able to handle 8K60 in 10/12 bits). Otherwise the connection is most likely restricted to 8 bits by either the cables or one or more devices in the HDMI chain.
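
For a rough sense of those numbers, here is a back-of-the-envelope calculation of the raw video data rates (active pixels only; HDMI blanking intervals and FRL/TMDS encoding overhead add more on top), to compare against the 18 / 40 / 48 Gb/s link classes:

```python
# Raw active-pixel data rate = width * height * fps * bits per component * components per pixel.
MODES = [
    # (label, width, height, fps, bits per component, components per pixel)
    ("4K23  10-bit RGB/4:4:4", 3840, 2160,  24, 10, 3),
    ("4K60  12-bit RGB/4:4:4", 3840, 2160,  60, 12, 3),
    ("4K120 10-bit RGB/4:4:4", 3840, 2160, 120, 10, 3),
    ("8K60  10-bit 4:2:0",     7680, 4320,  60, 10, 1.5),  # 4:2:0 averages 1.5 samples per pixel
]

for label, w, h, fps, bits, comps in MODES:
    gbps = w * h * fps * bits * comps / 1e9
    print(f"{label:25s} ~{gbps:5.1f} Gb/s raw")
```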

I personally don't see any advantage in using 8K with the JVC NZs for video playback. There is a small latency advantage if you're gaming, but that's about it. Even with gaming, 4K120 is a better option most of the time, and for video you get excellent results with E-shift X to get 8K on screen without having to carry such unnecessarily huge bandwidth, given that the content itself is in 4K. Unlike the older E-shift, which sat between 2K and 4K because it only had two subframes, E-shift X is true 8K: it upscales using four subframes, so it can address all the 8K pixels individually.
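
Purely as a conceptual sketch (not JVC's actual processing), four half-pixel-shifted subframes partition an 8K grid so that every 8K pixel position is covered exactly once by a native 4K subframe:

```python
import numpy as np

# Stand-in 8K frame (4320 x 7680), one value per pixel.
frame_8k = np.arange(4320 * 7680, dtype=np.float32).reshape(4320, 7680)

subframes = [
    frame_8k[0::2, 0::2],  # no shift
    frame_8k[0::2, 1::2],  # shifted right by half a (4K) pixel
    frame_8k[1::2, 0::2],  # shifted down by half a pixel
    frame_8k[1::2, 1::2],  # shifted down and right
]

# Each subframe is a native 4K image, and together they account for every 8K pixel once.
assert all(s.shape == (2160, 3840) for s in subframes)
assert sum(s.size for s in subframes) == frame_8k.size
```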

So my advice would be to go back to 4K23 to 4K60 in 10/12 bits for video content. This lessens the load on the cables and makes it easy to switch to 4K120 for gaming. I have far fewer issues and much better performance that way.

murray

  • Citizen of the Universe
  • *****
  • Posts: 544
Re: 8bit or 10bit, what is the truth?
« Reply #5 on: September 27, 2023, 01:56:35 pm »



Quote from: Manni
So my advice would be to go back to 4K23 to 4K60 in 10/12 bits for video content. This lessens the load on the cables and makes it easy to switch to 4K120 for gaming. I have far fewer issues and much better performance that way.
If a person only has 8-bit and 12-bit showing on the video card, can they tick just 12-bit and still enable the 10-bit tick in JRVR for HDR?

Hendrik

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 10724
Re: 8bit or 10bit, what is the truth?
« Reply #6 on: September 27, 2023, 01:57:39 pm »

That's fine. Over HDMI it's pretty normal for only 12-bit to be offered.
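
A tiny sketch of why that is lossless, assuming the usual MSB-aligned padding when 10-bit content travels in a 12-bit signal:

```python
import numpy as np

# Hypothetical 10-bit video samples (0..1023).
samples_10bit = np.array([0, 1, 512, 940, 1023], dtype=np.uint16)

# Carrying them in a 12-bit signal just pads two zero LSBs (i.e. multiplies by 4).
samples_12bit = samples_10bit << 2

# Recovering the original values is exact, so nothing is lost by selecting 12-bit output.
assert np.array_equal(samples_12bit >> 2, samples_10bit)
print(samples_12bit)
```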
~ nevcairiel
~ Author of LAV Filters