I'd be interested if it made a noticeable difference in the video quality. I have no clue what the real-world differences would be.
I’m only interested in real world differences

It depends on your display/projector. On my Samsung S90C (QD-OLED), even after a good manual calibration using the CMS, low stimulus levels are significantly undersaturated. This doesn’t show in standard calibration reports, but it’s obvious if you run saturation sweeps at 5-15% stimulus. As a result, colors are undersaturated in dark scenes. An HDR 3D LUT makes it possible to restore the proper saturation (and brightness levels) at all stimulus levels, not just at the levels the CMS works at.
This improvement is very visible, at least here. Your display could have more visible errors in other areas.
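For reference, this is roughly what I mean by a saturation sweep at low stimulus. The exact patch definitions depend on the profiling software you use; this is just a rough Python sketch that blends from grey towards a primary in linear BT.2020 light while holding the stimulus level roughly constant (the luma-based rescaling is my own simplification for illustration):

```python
import numpy as np

def saturation_sweep(primary_rgb, stimulus, steps=5):
    """Illustrative saturation sweep: blend from grey to a primary in
    linear light, rescaled so each patch sits at the same stimulus level.

    primary_rgb : linear BT.2020 RGB of the primary, e.g. (1, 0, 0) for red
    stimulus    : fraction of peak, e.g. 0.05 for a 5% stimulus patch
    """
    # BT.2020 luma coefficients, used here to hold patch luminance constant
    coeffs = np.array([0.2627, 0.6780, 0.0593])
    primary = np.asarray(primary_rgb, dtype=float)
    patches = []
    for sat in np.linspace(0.0, 1.0, steps + 1):
        rgb = (1.0 - sat) * np.ones(3) + sat * primary   # grey -> primary
        rgb *= stimulus / np.dot(coeffs, rgb)            # rescale to target stimulus
        patches.append(np.clip(rgb, 0.0, 1.0))
    return np.array(patches)

# 0/20/40/60/80/100% saturation red patches at 5%, 10% and 15% stimulus
for stim in (0.05, 0.10, 0.15):
    print(f"{stim:.0%} stimulus:\n{saturation_sweep((1, 0, 0), stim).round(4)}")
```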
If you tried to adjust these lower stim levels with the CMS, all the higher stim levels would end up wrong, because the CMS isn’t linear.
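To make that concrete: a 3D LUT stores an independent correction at every grid node and interpolates between them, so a fix baked in around the 5-15% nodes leaves the brighter nodes alone, whereas a CMS control shifts every level it touches. A minimal sketch of the lookup mechanics, using a hypothetical 17-point identity LUT (this is just to show how per-node corrections work, not any specific LUT generator):

```python
import numpy as np

def apply_3dlut(rgb, lut):
    """Minimal trilinear lookup in an N x N x N x 3 LUT (input and output 0-1).

    Each node stores its own corrected output, so a correction applied to
    the low-stimulus nodes does not move the nodes further up the range.
    """
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, float), 0, 1) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                                   # fractional position per axis
    out = np.zeros(3)
    for corner in range(8):                        # blend the 8 surrounding nodes
        idx = [(hi if corner >> axis & 1 else lo)[axis] for axis in range(3)]
        w = np.prod([f[a] if corner >> a & 1 else 1 - f[a] for a in range(3)])
        out += w * lut[idx[0], idx[1], idx[2]]
    return out

# identity 17-point LUT as a starting point (what a LUT generator then refines)
n = 17
grid = np.linspace(0, 1, n)
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3dlut([0.10, 0.10, 0.10], lut))   # -> ~[0.1, 0.1, 0.1] for identity
```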
SDR LUTs are an option with projectors because you can get the same peak brightness in SDR and HDR, so you can tone map to BT.2020 SDR, but with most HDR TVs your peak brightness is only available in HDR. For example, I get 1,360 nits in HDR but only around 500 nits in SDR, so clearly an SDR BT.2020 3D LUT doesn’t work for me.
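To put rough numbers on that: the PQ curve (SMPTE ST 2084) maps absolute luminance to signal level, so you can see how much of the range a 500-nit container covers compared to 1,360 nits. A quick sketch using the standard ST 2084 constants (not tied to any particular tool):

```python
def pq_encode(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 0-1 signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for peak in (500, 1360):
    print(f"{peak} nits -> PQ signal {pq_encode(peak):.3f}")
# ~0.68 for 500 nits vs ~0.78 for 1360 nits: the top of the panel's HDR range
# simply isn't reachable from a 500-nit SDR container.
```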
I’m going to post some before/after measurements to show this, but it really depends on your display and on your ability to make good profiles and generate good 3D LUTs.