Any interest in moving to the new GeForce 1070/1080 models?

Adhara:
Hi guys,

I need your opinion to help me make the right choice for a graphics card (I currently own a GTX 970).
I heard the GeForce 1070/1080 are coming...

Is there any advantage in moving to them (other than HEVC hardware decoding)?

1 - Due to the HDMI 2.0 bandwidth limitation, 2160p50/60 with RGB 4:4:4 at 10 bits is not possible (see the rough math below)
2 - Even though a 10-bit option is available in the Nvidia Control Panel, it seems to work only for pro cards (Quadro)

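For reference, here is a rough back-of-the-envelope check of point 1, assuming the standard CTA-861 raster (4400 x 2250 total at 60 Hz, i.e. a 594 MHz pixel clock) and HDMI 2.0's 600 MHz TMDS character-rate ceiling; the 4:2:2 rule reflects HDMI's fixed 12-bit 4:2:2 packing:

--- Code ---
# Rough feasibility check for 2160p60 over HDMI 2.0.
# Assumptions: CTA-861 timing (4400 x 2250 total raster at 60 Hz = 594 MHz
# pixel clock) and the HDMI 2.0 ceiling of a 600 MHz TMDS character rate.

PIXEL_CLOCK_MHZ = 4400 * 2250 * 60 / 1e6   # 594.0 MHz
TMDS_LIMIT_MHZ = 600.0                     # HDMI 2.0 maximum

def tmds_clock_mhz(bpc, subsampling):
    """Approximate TMDS character rate needed for a given 2160p60 mode."""
    if subsampling == "4:4:4":
        return PIXEL_CLOCK_MHZ * bpc / 8      # deep color scales the clock
    if subsampling == "4:2:2":
        return PIXEL_CLOCK_MHZ                # fixed packing, up to 12 bpc at 1x
    if subsampling == "4:2:0":
        return PIXEL_CLOCK_MHZ / 2 * bpc / 8  # half the samples, then scale
    raise ValueError(subsampling)

for bpc, sub in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2"), (10, "4:2:0")]:
    clk = tmds_clock_mhz(bpc, sub)
    verdict = "fits" if clk <= TMDS_LIMIT_MHZ else "exceeds HDMI 2.0"
    print(f"2160p60 {sub} {bpc:2d}-bit -> {clk:6.1f} MHz  ({verdict})")
--- End code ---

Only the RGB 4:4:4 10-bit combination overshoots the 600 MHz limit; everything else fits, which is why the choice comes down to dropping either bit depth or chroma resolution.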

--- Quote ---"Pro" cards is 10-bit OpenGL, however consumer cards can do 10-bit through Direct3D.
--- End quote ---


--- Quote ---NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI.
--- End quote ---

It looks like only ATI cards would produce proper 10-bit output.

Can you confirm this?
From your point of view, which of these is the best and which is the worst:

2160p60 - RGB 4:4:4 - 8 bpc
2160p60 - YCbCr 4:2:2 - 10 bpc
2160p60 - YCbCr 4:2:0 - 10 bpc


Maybe I should wait for the new ATI cards coming at the end of the month.
Does ATI do a better job than Nvidia for madVR/LAV stuff?

Regards.

Matt:
I'm toying with a 1080 GTX, but I'm a silly gamer!

Adhara:

--- Quote from: Matt on June 20, 2016, 12:40:48 pm ---I'm toying with a 1080 GTX, but I'm a silly gamer!

--- End quote ---

Hi Matt, you did not answer my post ;-)

Hendrik:
10-bit works fine on ordinary GeForce cards, and always has; a Quadro is not required.
AMD/ATI is not going to provide anything "better"; in recent months they have once again been plagued by driver troubles, which went as far as crashing with ROHQ.

Which format you want to use is up to you. Nearly all consumer video content is 4:2:0 or at best 4:2:2 anyway, so with real-world video content you will likely not see a difference when using 4:2:2.
With proper dithering, however, you'll likely not see a difference with 8-bit either.
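
To make that concrete, here is a tiny NumPy sketch (illustrative numbers only, not what madVR actually does internally) of why dithering hides the 10-bit to 8-bit step on a smooth gradient:

--- Code ---
# Minimal sketch: quantize a smooth 10-bit ramp to 8 bits, with and without
# dithering (illustrative only, not madVR's actual dithering algorithm).
import numpy as np

rng = np.random.default_rng(0)

# One scanline of a smooth 10-bit gradient, 3840 pixels wide.
ramp10 = np.linspace(0.0, 1023.0, 3840)

# Plain truncation to 8 bits: the ramp collapses onto 256 flat steps (banding).
truncated = np.floor(ramp10 / 4.0)

# Dithered quantization: add up to one quantization step of noise before
# rounding down, so neighboring pixels straddle adjacent 8-bit codes and
# their local average still tracks the original 10-bit value.
dithered = np.clip(np.floor(ramp10 / 4.0 + rng.random(ramp10.size)), 0, 255)

def mean_run_length(x):
    """Average length of runs of identical consecutive values (band width)."""
    change_points = np.flatnonzero(np.diff(x) != 0) + 1
    runs = np.diff(np.concatenate(([0], change_points, [x.size])))
    return runs.mean()

for name, out in [("truncated", truncated), ("dithered", dithered)]:
    err = ramp10 - out * 4.0
    print(f"{name:9s}: mean error = {err.mean():+5.2f}, "
          f"mean band width = {mean_run_length(out):4.1f} px")
--- End code ---

Plain truncation turns the ramp into roughly 15-pixel-wide flat bands with a constant bias, while the dithered version breaks those bands into pixel-level noise whose local average tracks the original signal.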

Guybrush:

--- Quote from: Matt on June 20, 2016, 12:40:48 pm ---I'm toying with a 1080 GTX, but I'm a silly gamer!

--- End quote ---

Do you have a 4K screen? I'm perfectly happy with my 980Ti for gaming, but when a full UHD Blu-ray solution for PC is available, I'll have to find a way to upgrade for HDCP 2.2 (unless that solution includes RedFox). That will be a pain because I have my 980Ti liquid cooled, so I guess I'll have to reattach the original cooler before I can sell it.
