The HDR aspect of 4K Blu-rays is touted as the most significant improvement over old 1080p media, even above the increased resolution. Therefore I think it should be "darn obvious" and not a "maybe you can see it" type of experience. And it is obvious, come to think of it: the orange palette in the outdoor scenes in The Martian looked so intense that it painted my entire living room with a new hue.
The most talked-about approach (so not scientific) seems to be "get all the color, forget precision": 4:4:4 at 8 bits, which can be pushed at 60Hz over HDMI 2.0. But that's just not enough.
Because HDR10 is 10-bit and Dolby Vision is 12-bit. So going by the numbers, I'd say go 4:4:4 at 10 or 12 bits (if the panel/projector allows it) and stay stuck at 24Hz; the bandwidth math works out, see the sketch below. Most content one would want to watch is 24p anyway (talking only about 4K material); demo files with other specs are only for 4K geeks. Any other files (1080p at 60Hz)? Throw your hands in the air. Or, I don't know, use a second port on the device (and things get complicated).
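To put some numbers on that trade-off, here's a rough sanity check. It's only a sketch, assuming CTA-861 blanking totals for 2160p and the commonly cited ~14.4 Gbps effective payload on HDMI 2.0's 18 Gbps link (8b/10b coding eats the rest):

```python
# Rough HDMI 2.0 bandwidth check for the subsampling / bit-depth / refresh-rate
# trade-off discussed above. Assumes CTA-861 timing totals (blanking included)
# and ~14.4 Gbps effective TMDS payload (18 Gbps raw, 8b/10b coding).

HDMI20_EFFECTIVE_GBPS = 14.4  # 3 lanes x 6 Gbps raw, x0.8 for 8b/10b

# (active_h, active_v, total_h, total_v) per CTA-861 for 3840x2160
TIMINGS = {
    24: (3840, 2160, 5500, 2250),
    60: (3840, 2160, 4400, 2250),
}

def bits_per_pixel(subsampling: str, bit_depth: int) -> float:
    # 4:4:4 carries 3 full samples per pixel, 4:2:2 carries 2, 4:2:0 carries 1.5
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return samples * bit_depth

def required_gbps(refresh_hz: int, subsampling: str, bit_depth: int) -> float:
    _, _, total_h, total_v = TIMINGS[refresh_hz]
    pixel_clock = total_h * total_v * refresh_hz  # Hz, blanking included
    return pixel_clock * bits_per_pixel(subsampling, bit_depth) / 1e9

for hz, sub, bits in [(60, "4:4:4", 8), (60, "4:4:4", 10), (60, "4:2:0", 10),
                      (24, "4:4:4", 10), (24, "4:4:4", 12)]:
    need = required_gbps(hz, sub, bits)
    verdict = "fits" if need <= HDMI20_EFFECTIVE_GBPS else "does NOT fit"
    print(f"2160p{hz} {sub} {bits}-bit: {need:5.2f} Gbps -> {verdict}")
```

Under those assumptions, 2160p60 4:4:4 8-bit squeaks in at about 14.3 Gbps, 10-bit 4:4:4 at 60Hz blows past the limit at about 17.8 Gbps, while 10- and 12-bit 4:4:4 both fit comfortably at 24Hz, and 4:2:0 10-bit is the usual fallback for 4K60 HDR.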
My problem is that a lot of 4K TVs have been released with only ONE HDMI 2.0 port that supports HDR and the rest (and normal AVRs have only one HDMI out, unless one jumps above $500). So how the heck can the TV be calibrated for two different color specs on the same port? It detects the HDR metadata and cranks the backlight to 10, but there's no logic to swap calibrated profiles depending on the feed. This is dumb.
And BTW, I don't know about Europe, but in the US a decent 4K TV can be had for under $1,000 now. Not the best, but not the worst either. I got mine around Thanksgiving 2015 without breaking the bank. Only a 65" OLED costs $3,000 (no comment on jmone's projector, which can be traded for a car). TVs are affordable. Too bad true 4K content is way behind expectations (new movies coming out from 2K masters, what the heck?!), and the transport spec is a mess.