So, I came back to JRiver/JRVR after a few weeks playing again with MPC-BE/madvr.
I noticed that, in NVCP, the "power management mode" MUST be set to "prefer maximum performance" when using JRiver, otherwise I get random dropped frames that may or may not have existed a couple of months ago (maybe I just didn't notice at the time... but I doubt it).
I have an RTX 3090, which should be able to swallow any JRVR setting. Yet, when the NVCP "power management mode" setting is set to "adaptive", JRVR render times are way (WAY) higher.
Take a 2160p23 DV file on an HDR 4K screen (LG C2). No luma upscaling (obviously), HDR passthrough (so no DTM), audio bitstreaming. JRVR settings are the default "quality" preset + adaptive sharpen 50.
With "adaptive" : 10ms avg / 20ms to 30ms peak (on OSD... probably a bit higher, of frames wouldn't drop, would they?...)
With "Maximum performance" : 1.4ms avg / 1.5-2.5ms peak
I don't know if it's a driver bug, a JRVR bug, or an NVIDIA power management "feature". To my untrained eye, it feels like, even with the "adaptive" setting, the GPU should switch to a "performance" state when JRiver is running and actually using GPU power. But it's probably way more complicated than that from a dev point of view.
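If anyone wants to see what the card is actually doing, here's a minimal sketch (my own, not anything from JRiver) that polls nvidia-smi once a second for the performance state and clocks while a file is playing. It assumes nvidia-smi is on your PATH (it ships with the driver, also on Windows); if the card sits around P8 with idle clocks during playback under "adaptive", that would point at the power management never kicking in.

# quick diagnostic: print the GPU performance state (P0 = full clocks,
# P8 = idle) plus core/memory clocks and utilization once a second,
# so you can watch whether "adaptive" ever clocks up during playback.
# Assumes nvidia-smi is on PATH; Ctrl+C to stop.
import subprocess
import time

QUERY = "pstate,clocks.gr,clocks.mem,utilization.gpu"

while True:
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    # example output line: "P8, 210 MHz, 405 MHz, 3 %"
    print(time.strftime("%H:%M:%S"), result.stdout.strip())
    time.sleep(1)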
I still felt it was worth reporting, though.
EDIT: okay, so it's a well-known issue:
https://yabb.jriver.com/interact/index.php/topic,136395.msg944765.html#msg944765
Oh well, now there's one more post for dummies like me to find a solution.