Would AMD/Intel make any difference for HDR handling, e.g. AMD Ryzen 9 3900X vs Intel Core i9-9900K?
The reason HDR needs to be “handled” is that no current display technology can fully cover its color or brightness range. This is especially true for projectors (except maybe the Christie Eclipse ... but that's out of reach for most of us).
So you need to map HDR content into the color/brightness range that your display can handle. This process is called tone mapping, and of course there will be sacrifices in quality. What's even worse, there is no standard way of doing it.
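To make "tone mapping" concrete, here's a minimal Python sketch of the idea: a curve that keeps shadows roughly linear and rolls off highlights so that, say, a 1000-nit master fits a ~100-nit projector. The extended-Reinhard curve and the peak values are my own illustrative choices, not what MadVR or any display actually uses:

```python
# A minimal sketch of what tone mapping does, using an extended
# Reinhard-style curve.  Real tone mappers (MadVR included) are far
# more sophisticated and work in PQ space; the peaks here are just
# example numbers.

def tone_map(nits: float, source_peak: float = 1000.0,
             display_peak: float = 100.0) -> float:
    """Compress HDR luminance (in nits) into the display's range."""
    x = nits / display_peak            # luminance in display-relative units
    w = source_peak / display_peak     # where the curve must reach 1.0
    y = x * (1.0 + x / (w * w)) / (1.0 + x)   # gentle highlight roll-off
    return min(y, 1.0) * display_peak  # back to physical nits, clipped

if __name__ == "__main__":
    for nits in (1.0, 100.0, 400.0, 1000.0):
        print(f"{nits:7.1f} nits in -> {tone_map(nits):6.2f} nits out")
```

Note how the low end stays nearly untouched (1 nit in, ~1 nit out) while the top of the range gets squeezed; that compression of highlights is exactly the "sacrifice in quality" mentioned above.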
You basically have 3 choices for where to do tone mapping: let the display do it (if it's rated for HDR), let a dedicated box do it (Lumagen or MadVR Envy), or let a PC do it via MadVR.
The second option is expensive. The first option is usually the lowest quality (most displays have simple tone mapping, limited by their processing power, that often introduces color/hue shifts).
The last option is what most people use if they go the HTPC route, but you need a powerful GPU for it, as MadVR uses the GPU for both scaling and tone mapping.
I can't speak for other GPUs, but a 1070 handles HDR fine with scaling set to a very high level (not maximum, though). The only issue I have is 4K 60fps content (Gemini Man and Billy Lynn); for those titles I have to turn off the setting that preserves hue/color, so there may be color/hue shifts, but I haven't noticed any.
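Why a "preserve hue" setting costs extra GPU time: applying the tone curve to each RGB channel independently is cheap but shifts/desaturates bright colors, while scaling all three channels by a single luminance-based factor keeps the hue at the cost of more work. The sketch below is purely illustrative (it is not MadVR's actual implementation, and the Rec.709 luma weights and sample values are my own assumptions):

```python
# Illustration only: per-channel vs. hue-preserving tone mapping.

def luminance(r: float, g: float, b: float) -> float:
    """Rec.709 luma weights; real HDR pipelines use Rec.2020."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def tone_map_per_channel(rgb, curve):
    """Fast path: curve each channel; hue of bright colors can shift."""
    return tuple(curve(c) for c in rgb)

def tone_map_preserve_hue(rgb, curve):
    """Slower path: curve the luminance, scale all channels by one ratio.
    The result may land outside the display gamut and then needs
    clipping back in, which is part of the extra work."""
    y = luminance(*rgb)
    scale = curve(y) / y if y > 0 else 0.0
    return tuple(c * scale for c in rgb)

if __name__ == "__main__":
    curve = lambda x: x / (1.0 + x)        # any roll-off curve works here
    bright_red = (4.0, 0.5, 0.5)           # saturated HDR-ish highlight
    print(tone_map_per_channel(bright_red, curve))   # R:G ratio collapses
    print(tone_map_preserve_hue(bright_red, curve))  # R:G:B ratio preserved
```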
With MadVR the sky is the limit: the more money you spend on a GPU, the higher the settings you can crank up, though there are diminishing returns. The minimum I would go for is a 1060 equivalent. When I have the money I'll switch to a 2080 Ti, but at the moment I'm pretty happy with the 1070.
EDIT: The CPU is not a bottleneck when playing movies on the same machine. It becomes a bottleneck if you're transcoding video to be played on mobile devices, or if you're converting DSD audio.