JRVR is outputting 10-bit to the graphics driver. Whether the driver/GPU then reduces the image to 8-bit because the display connection is only 8-bit is opaque to us. If your display connection is only 8-bit, I would recommend not enabling 10-bit in JRVR.
If you can't use a 10-bit mode with that projector though, something in the chain is not supporting HDMI 2.1. Given that projector's spec, I would assume the projector itself does, so the culprit may be your AVR, some other HDMI device in the chain, or even a cable. A 3080 is new enough to support it.
The JVC NZs definitely support full HDMI 2.1 at 48 Gb/s (I have an NZ8), so they can do 8K60 in 10-bit, but only if the EDID in the PJ is set to EDID A and, as you say, if the cables and all devices in between (switch, AVR) are up to scratch (not many are; for example, many switches and AVRs only support 40 Gb/s, so they wouldn't be able to handle 8K60 in 10/12-bit). Otherwise the connection is most likely restricted to 8-bit by either the cables or one or more devices in the HDMI chain.
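To put rough numbers on why 40 Gb/s gear runs out of headroom, here is a back-of-the-envelope sketch (not a spec-accurate calculation). The function names are mine, the total timings (4400x2250 for 4K60, 9000x4400 for 8K60) are the commonly cited CTA-861 values, and it assumes HDMI 2.1's 16b/18b FRL coding while ignoring DSC, audio, metadata and FEC overhead:

```python
# Rough HDMI bandwidth estimate: pixel clock (including blanking) x bits per pixel.
# Assumptions (mine, not from the post above): CTA-861-style total timings,
# 16b/18b FRL coding on HDMI 2.1 links, no DSC, no audio/metadata overhead.

def data_rate_gbps(h_total, v_total, refresh_hz, bits_per_component, chroma="444"):
    """Approximate uncompressed video data rate in Gb/s."""
    pixel_clock = h_total * v_total * refresh_hz        # pixels per second, incl. blanking
    bpp = {"444": 3.0, "422": 2.0, "420": 1.5}[chroma]  # average components per pixel
    return pixel_clock * bits_per_component * bpp / 1e9

def frl_payload_gbps(link_gbps):
    """Usable payload of an HDMI 2.1 FRL link after 16b/18b coding."""
    return link_gbps * 16 / 18

print(f"4K60 10-bit 4:4:4  : {data_rate_gbps(4400, 2250, 60, 10, '444'):5.1f} Gb/s")
print(f"8K60 10-bit 4:2:0  : {data_rate_gbps(9000, 4400, 60, 10, '420'):5.1f} Gb/s")
print(f"8K60 10-bit 4:4:4  : {data_rate_gbps(9000, 4400, 60, 10, '444'):5.1f} Gb/s")
print(f"40 Gb/s link payload: {frl_payload_gbps(40):5.1f} Gb/s")
print(f"48 Gb/s link payload: {frl_payload_gbps(48):5.1f} Gb/s")
```

Even with these optimistic numbers, 8K60 at 10-bit already needs more than a 40 Gb/s link can carry (and 4:4:4 exceeds even 48 Gb/s without compression), while 4K60 at 10-bit fits with plenty of margin, which is roughly why the advice below holds.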
I personally don't see any advantage in using 8K with the JVC NZs for video playback. There is a small latency advantage if you're gaming, but that's about it. Even for gaming, 4K120 is the better option most of the time, and for video you get excellent results with E-shift X, which puts 8K on screen without having to carry such an unnecessarily huge bandwidth, given that the content itself is 4K. Unlike the original E-shift, which sat between 2K and 4K because it only had two subframes, E-shift X upscales using four subframes, and since an 8K frame has exactly four times the pixels of the 4K panel, it can address every 8K pixel individually.
So my advice would be to go back to 4K23-4K60 in 10/12-bit for video content. This lessens the load on the cables and makes it easy to switch to 4K120 for gaming. I have far fewer issues and much better performance that way.