That's not correct, it's actually the opposite, I think. sRGB will make dark tones appear brighter than gamma 2.2.
Yes, it would - however, that is only true if you are talking about the screen. The video renderer applies it in inverse to make the video signal match the processing of the screen, which is why 2.4 looks brighter, even though calibrating a screen to 2.4 would make it darker (and both combined would be neutral). That's why the gamma option feels backwards: because it is. It does the inverse of what your screen would do. 2.4 gets darker on the screen, so the image it gets delivered is brighter.
Gamma in the video renderer here is not about changing the image, it's about spending bits on the parts that matter. Hence it does the inverse of the screen behavior, so the image stays neutral.
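To make the "inverse" point concrete, here is a minimal sketch (illustrative only, not JRVR code) showing that encoding linear light for a 2.4 screen produces a brighter signal than encoding for 2.2, and that the screen's darker decode then cancels it out:

```python
def encode(linear, gamma):
    """Encode step done by the renderer: apply the inverse of the screen's gamma."""
    return linear ** (1.0 / gamma)

def decode(signal, gamma):
    """Decode step done by the screen: apply its gamma curve."""
    return signal ** gamma

linear = 0.1                              # a dark linear-light value
print(round(encode(linear, 2.2), 3))      # ~0.351
print(round(encode(linear, 2.4), 3))      # ~0.383 -> signal delivered for 2.4 is brighter
print(round(decode(encode(linear, 2.4), 2.4), 3))  # ~0.1 -> round-trips to neutral
```

So the higher gamma setting brightens the delivered image exactly as much as the 2.4 screen darkens it.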
If you have a screen that supports various gamma curves, swapping between them and changing JRVR accordingly should hopefully produce a relatively consistent image. Of course, without a meter, determining that in the analog world is always a bit tricky, as I myself can barely tell the difference if I can't swap between them very quickly. (And my stupid gaming screen supports 1.9, 2.2, and 2.5 - who picked those? The other screens have no options; cba to check the TV right now.)
Note that sRGB practically never applies to video. You might have a PC screen calibrated to it, but the video you are playing is not mastered to sRGB, and it's never used internally unless specifically configured as the output gamma encoding.
There are no "in-memory" sRGB values; internally between steps it's either left as-is or converted to linear light as needed. It's decoded to linear light with the source transform, e.g. BT.1886, and re-encoded to gamma light either with the same transform, or with whatever you configured as an override.
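That pipeline can be sketched roughly like this (a simplified illustration under my own naming, not actual JRVR internals; BT.1886 is idealized to a pure 2.4 power curve here, i.e. zero black level):

```python
def bt1886_decode(v, gamma=2.4):
    # Idealized BT.1886 with zero black level reduces to a pure power curve.
    return v ** gamma

def encode(linear, gamma):
    # Re-encode linear light to gamma light for output.
    return linear ** (1.0 / gamma)

def render(pixel, override_gamma=None):
    linear = bt1886_decode(pixel)       # source transform -> linear light
    # ...scaling, tone mapping, etc. would happen here, in linear light...
    out_gamma = override_gamma or 2.4   # no override: re-encode with the source transfer
    return encode(linear, out_gamma)

print(round(render(0.5), 6))            # 0.5 -> round-trips unchanged without an override
```

With no override configured, decode and encode cancel and the pixel values pass through untouched.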
"disabled" is in most cases equal to BT.1886. sRGB might just look similar due to a similar gamma response. You can actually see the entire conversion disappear from the OSD when its set to either disabled or BT.1886
I'll add one more detail: I imagine JRVR is making use of the DirectX VideoProcessor for hardware-accelerated decoding and deinterlacing. I know that its output is a buffer with sRGB-encoded values (well, encoded with the inverse sRGB function, since sRGB is an EOTF, not an OETF).
The video processor works in YCbCr and only performs deinterlacing; it does not take or output RGB. No further processing is enabled, and it's also only used at all on interlaced videos.
It quite certainly does not re-interpret the gamma of the video, because it's not even told the gamma. If anything it would just pass it through, because it has no basis to know that the input and output would be, or are supposed to be, different.
If you disable hardware video decoding, then no DirectX video functions are touched at all, and I'm also quite confident that the decoder has no impact on the image even if it's used; that would've been noticed in years of using hardware decoding. There are plenty of pixel-peeping people using LAV Filters etc.
If the intent was for the user to just set Gamma Processing to the gamma value of their display, so that they then see the video as if their display was configured to BT.1886 or gamma 2.4, then that's not what's happening.
The intent is for the user not to touch that value, and if they want to adapt the reproduction to their screen, to measure it with a meter and produce an ICC profile, since basically no screen has a perfect gamma response.
Without measuring the final response, all you can do is set it by feel. How can you be sure it isn't already close to how it's supposed to look?
What the setting does is encode the linear light using that gamma function. Setting it to the same function the screen will use to decode it results in you getting back the same linear RGB the image contained originally, which is most commonly needed when your screen is not using the same gamma function as the video. That is probably most often true for PC screens, as TVs are typically going to use BT.1886, or perhaps 2.4 (which on an OLED would be close to equal to BT.1886, due to the infinite contrast).
So yes, what that option is supposed to do is make the image look like you are watching it on a BT.1886 screen with a typical 1000:1 contrast ratio (which is much closer to 2.2 than 2.4). It's not designed to change the image, just to adapt it to your screen (hence the calibration section). This might not entirely match an OLED however, as even if it uses BT.1886 it would probably plug its own contrast into the formula, which, if truly infinite, is just 2.4 - not quite what SDR mastering typically targets.
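The 1000:1 point can be checked numerically with the actual BT.1886 EOTF (ITU-R BT.1886 Annex 1, with the a/b coefficients derived from white and black luminance). A small sketch - with a 0.001 black level the effective mid-tone exponent lands near 2.2, while a zero black level collapses to a pure 2.4 power law:

```python
import math

def bt1886(v, lw=1.0, lb=0.0):
    """BT.1886 EOTF: signal v in [0,1] -> luminance (ITU-R BT.1886 Annex 1)."""
    g = 2.4
    a = (lw ** (1 / g) - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (lw ** (1 / g) - lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

def effective_gamma(v, lb):
    """The pure-power exponent that would map signal v to the same luminance."""
    return math.log(bt1886(v, lb=lb)) / math.log(v)

print(round(effective_gamma(0.5, 0.001), 2))  # ~2.21 at 1000:1 contrast
print(round(effective_gamma(0.5, 0.0), 2))    # 2.4 with infinite contrast
```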
--
One bit of confusion might come from the naming of the option, especially if you used e.g. madVR before. This JRVR option basically corresponds to "the display is calibrated to the following transfer / gamma" in the madVR settings, except that JRVR will actually re-shape the gamma of the video to match that, something madVR does not do. madVR has a separate "gamma processing" option which does what you want, i.e. adjust the gamma in such a way that it emulates the desired gamma curve on your screen - which is not an option we currently have. You can try to cheat it by setting the gamma processing to a "wrong" value, or do it with an ICC profile or a 3DLUT.
Our default behavior matches madVR's, fwiw: not touching gamma at all, and just sending it to the display as it was in the original video - which is what "disabled" gives you.
It was actually originally named "display gamma" or something like that, but to discourage setting it without really understanding it, I renamed it and introduced the "disabled" option, so that gamma is passed through - which in most cases is the best option, as that's what people expect to happen.