Hi guys,
I need your opinion on making the right choice for a graphics card (I currently own a GTX 970).
I heard the GeForce GTX 1070/1080 are coming...
Is there any advantage in moving to them (apart from HEVC hardware decoding)?
1 - Due to the HDMI 2.0 bandwidth limitation, 2160p50/60 with RGB 4:4:4 at 10 bits is not possible
2 - Even though a 10-bit option is available in the Nvidia Control Panel, this option seems to work only on pro cards (Quadro)
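The bandwidth limit in point 1 can be sanity-checked with some quick arithmetic. A minimal sketch, assuming the standard CTA-861 4K60 timing (4400 x 2250 total pixels per frame) and HDMI 2.0's 600 MHz TMDS character-rate cap (both figures are my assumptions, not from the post):

```python
# Rough HDMI 2.0 bandwidth check for 2160p60 RGB output.
# Assumed: CTA-861 4K60 timing = 4400 x 2250 total pixels/frame,
# HDMI 2.0 TMDS character-rate limit = 600 MHz.

TOTAL_H, TOTAL_V, FPS = 4400, 2250, 60
TMDS_LIMIT_MHZ = 600.0

# Base pixel clock for 4K60: 4400 * 2250 * 60 = 594 MHz
pixel_clock_mhz = TOTAL_H * TOTAL_V * FPS / 1e6

def rgb_tmds_clock_mhz(bpc):
    # For RGB, the TMDS clock scales with bits-per-component relative to 8
    return pixel_clock_mhz * bpc / 8

print(rgb_tmds_clock_mhz(8))   # 594.0 MHz -> fits under the 600 MHz cap
print(rgb_tmds_clock_mhz(10))  # 742.5 MHz -> exceeds HDMI 2.0, so no 10-bit RGB 4:4:4
```

So 10-bit RGB 4:4:4 at 2160p60 would need a 742.5 MHz TMDS clock, which is over the HDMI 2.0 limit, matching point 1.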
"Pro" cards get 10-bit OpenGL; however, consumer cards can do 10-bit through Direct3D.
NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI.
Looks like only ATI cards would produce a proper 10-bit output.
Can you confirm this?
From your point of view, which is the best and which is the worst among:
2160p60 - RGB 4:4:4 - 8 bpc
2160p60 - YCbCr 4:2:2 - 10 bpc
2160p60 - YCbCr 4:2:0 - 10 bpc
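For what it's worth, all three of those modes do fit within HDMI 2.0, which a short sketch can show. Assumptions (mine, not from the post): 594 MHz base pixel clock for 4K60, a 600 MHz TMDS cap, 4:2:2 over HDMI always packing its components into the same fixed-size slot as 8-bit 4:4:4, and 4:2:0 halving the TMDS clock:

```python
# Compare the three candidate 2160p60 modes against the HDMI 2.0 TMDS limit.
# Assumed: 594 MHz base pixel clock (CTA-861 4K60), 600 MHz TMDS cap.

BASE_MHZ = 594.0
LIMIT_MHZ = 600.0

def required_tmds_mhz(sampling, bpc):
    if sampling == "4:2:2":
        # HDMI carries 4:2:2 (up to 12-bit) in the same container as 8-bit RGB
        return BASE_MHZ
    clock = BASE_MHZ * bpc / 8   # scales with component depth
    if sampling == "4:2:0":
        clock /= 2               # half-rate pixel packing
    return clock

for sampling, bpc in [("4:4:4", 8), ("4:2:2", 10), ("4:2:0", 10)]:
    mhz = required_tmds_mhz(sampling, bpc)
    verdict = "fits" if mhz <= LIMIT_MHZ else "too fast"
    print(f"2160p60 {sampling} {bpc}bpc -> {mhz} MHz: {verdict}")
```

All three come in at or below 600 MHz (594, 594 and 371.25 MHz respectively), so the choice is really about 8-bit full chroma vs 10-bit subsampled chroma, not bandwidth.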
Maybe I should wait for the new ATI cards coming at the end of the month.
Does ATI do a better job than Nvidia for madVR/LAV stuff?
Regards.