These values would depend on the bit depth, not the resolution. That's because in 8-bit, video black sits at 16, while in 10-bit it sits at 64.
Hence using a decimal works fine, as it gets multiplied by the maximum value for the bit depth - e.g. 0.1 in 8-bit is 255 * 0.1 = 26 (post-rounding), and in 10-bit it's 1023 * 0.1 = 102 - both representing pretty much the same relative level for their bit depth.
The default value is actually 24.0/255.0 = 0.094, so if you don't specify anything at all, it works fine at all bit depths. Only if you specify a value > 1.0 do you lock yourself into a specific bit depth, and that should be avoided. So 0.1 would be fine for all resolutions and bit depths.
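To make the scaling concrete, here's a minimal sketch of the math described above. The function name is purely illustrative (not from any particular tool's API); it just scales a normalized 0.0-1.0 threshold to an integer code value for a given bit depth:

```python
def threshold_to_code_value(threshold: float, bit_depth: int) -> int:
    """Scale a normalized threshold (0.0-1.0) to an integer code value.

    Illustrative helper, not a real API: the threshold gets multiplied
    by the maximum code value for the bit depth (255 for 8-bit,
    1023 for 10-bit) and rounded.
    """
    max_value = (1 << bit_depth) - 1
    return round(threshold * max_value)

print(threshold_to_code_value(0.1, 8))             # 0.1 in 8-bit  -> 26
print(threshold_to_code_value(0.1, 10))            # 0.1 in 10-bit -> 102
print(threshold_to_code_value(24.0 / 255.0, 8))    # the 0.094 default -> 24 in 8-bit
```

Since the threshold is relative, the same 0.1 lands at the equivalent brightness level regardless of bit depth, which is exactly why a value > 1.0 (an absolute code value) is the only way to get it wrong.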
The real trouble starts when you are dealing with SDR vs. HDR, because while 0.1 is pretty dark in SDR, in HDR it's quite a different beast.