I'm definitely not an expert with the newer Nvidia cards, but I'll chime in with a couple of tidbits for you, Z0001.
Historically, in my experience, AMD provided better color, better support for the refresh rates an HTPC needs, and better control over deinterlacing. I have AMD in my HTPC and everything works very well with MadVR. I would say AMD used to be the preferable choice.
However, AMD's drivers have been getting worse: they removed the deinterlacing controls, and they have issues with HDR.
Generally, people now seem to think the higher-horsepower Nvidia cards run cooler and quieter than AMD's, and that this gives an advantage if you want to use the expensive scaling algorithms at 4k.
As soon as you bring 4k into the picture, things get dicier. I see very little difference between the cheap and expensive scaling algorithms on a moving image at 4k. Also, a lot of people think they need 4k but can't actually see it, because they sit too far from the set for the human eye to resolve it. So the 4k/horsepower issue might not really matter for you. But if you have a projector with a giant screen, or a big TV that you sit very (too) close to, 4k matters.
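For what it's worth, here's the back-of-envelope math behind the viewing-distance point, as a rough Python sketch. It assumes roughly 20/20 acuity (about one arcminute per pixel); the function name, the acuity figure, and the 65" example are my own illustration, not anything from MadVR or a spec sheet.

    import math

    def max_distance_to_resolve_4k(diagonal_inches, aspect=(16, 9),
                                   horizontal_pixels=3840,
                                   acuity_arcmin=1.0):
        """Rough farthest viewing distance (in feet) at which an eye with
        ~20/20 acuity (~1 arcminute) can still distinguish individual
        pixels on a 4k panel. Much beyond this, the extra resolution over
        1080p is largely wasted."""
        w, h = aspect
        width_inches = diagonal_inches * w / math.hypot(w, h)
        pixel_pitch = width_inches / horizontal_pixels       # inches per pixel
        angle_rad = math.radians(acuity_arcmin / 60.0)       # 1 arcminute in radians
        distance_inches = pixel_pitch / math.tan(angle_rad)
        return distance_inches / 12.0                        # convert to feet

    # Example: a 65" 4k TV is only fully resolved from roughly 4 feet away.
    print(round(max_distance_to_resolve_4k(65), 1))  # ~4.2

If your couch is farther back than that number for your screen size, the cheap-vs-expensive scaler difference is going to be even harder to see.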
Lastly, it's my understanding that currently only Nvidia cards can properly switch between HDR and SDR modes with MadVR, and for that reason Madshi only recommends Nvidia if you're doing HDR. I don't know if this has changed lately.
If you only have SDR content (quite possible), that won't matter to you. If you have a mix, it will. Personally, I don't care: the HDR content I've seen made much, much less of a difference to me than moving to an OLED television, which was a massive improvement. To me HDR is minor. I mention this because HDR/SDR switching might be the killer advantage of Nvidia, but you need to let your own eyes tell you whether it matters.
I'm interested in what others with more recent experience on the new Nvidia cards think, but a lot of people have only ever used one brand and so can't really compare. I used to run Nvidia but have been on AMD for a few years now.
Good luck...
-Will