100% CPU load, 8K HDR, 60 FPS
Manfred:
If I play
https://www.youtube.com/watch?v=hVvEISFw9w0
locally with RO HQ, my i7 (4C/4T) goes to 100% utilization, whereas the GPU, a GTX 1070 8GB, sits at only 21%.
The file type is .webm.
madVR reports the movie as 3840x2160 HDR 1100 nits, BT.2020 -> custom gamut, primaries BT.2020, chroma upscaling: Jinc AR, image downscaling: BC150 LL AR, P010, 10-bit 4:2:0.
HDR in madVR is configured as: use pixel shader math.
Why is the CPU load high and the GPU load low?
Playing the same video directly from YouTube in MS Edge gives CPU 21%, GPU 51%.
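
One way to confirm what the local file actually contains is to probe it. A minimal Python sketch, assuming ffprobe is on the PATH and the downloaded clip is saved as video.webm (both are placeholder assumptions, not details from the post):

[code]
import json
import subprocess

def probe_video(path: str) -> dict:
    """Return codec metadata for the first video stream via ffprobe."""
    out = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=codec_name,profile,pix_fmt",
            "-of", "json", path,
        ],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

if __name__ == "__main__":
    info = probe_video("video.webm")  # placeholder filename
    # Per the madVR OSD readout above, we'd expect something like
    # vp9 / Profile 2 / yuv420p10le, i.e. 10-bit 4:2:0 VP9.
    print("codec:  ", info.get("codec_name"))
    print("profile:", info.get("profile"))
    print("pix_fmt:", info.get("pix_fmt"))
[/code]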
Manfred:
I think I got it: the GTX 1070 does not support 10-bit VP9 (.webm) hardware decoding, only 8-bit.
Converting it with MC to H.264 makes the colours look washed out. The best playback is with Edge.
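
To test the hardware-decoding hypothesis directly, one could force ffmpeg's NVDEC VP9 decoder on the file and see whether it succeeds. A minimal sketch, assuming an ffmpeg build with cuvid support is on the PATH and using video.webm as a placeholder filename:

[code]
import subprocess

def nvdec_can_decode(path: str) -> bool:
    """Try decoding the file with NVIDIA's hardware VP9 decoder.

    Forces the vp9_cuvid decoder (no software fallback) and discards
    the decoded frames; a non-zero exit code means NVDEC refused the
    stream, e.g. because its profile (10-bit) is unsupported.
    """
    result = subprocess.run(
        [
            "ffmpeg", "-v", "error",
            "-c:v", "vp9_cuvid",   # force the NVDEC VP9 decoder
            "-i", path,
            "-f", "null", "-",     # decode only, throw the frames away
        ],
        capture_output=True, text=True,
    )
    return result.returncode == 0

if __name__ == "__main__":
    if nvdec_can_decode("video.webm"):  # placeholder filename
        print("NVDEC decoded it in hardware")
    else:
        print("NVDEC refused it -- decoding falls back to the CPU")
[/code]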
tij:
I don't think that's it ... if your GPU did not support 10-bit VP9, playing the video through Edge would have caused 100% CPU load for decoding too.
Since Edge does not use 100% CPU ... it is somehow using your GPU to decode it ...
Maybe MC is wrongly recognising your GPU as not capable of decoding VP9?
Manfred:
OK: Edge has something like madVR's Ctrl+J OSD, and it says it's playing BT.709, which means it has mapped 10-bit -> 8-bit. 8-bit VP9 is supported by the GTX 1070. That would mean Edge has done some colour and bit-depth mapping internally. Edge does use the GPU; you can see that in the Windows Task Manager.
So MC is correct.
tij:
There is no way Edge can transcode VP9 without heavy CPU usage if hardware decoding is not available ... very likely it negotiated with YouTube to receive H.265 or H.264 instead.
According to this https://en.m.wikipedia.org/wiki/Nvidia_PureVideo
the 1070 supports 10-bit VP9 decoding (not encoding).
So maybe MC is not utilizing this feature for greater compatibility? (Maybe AMD GPUs don't support VP9 decoding?) ... just shooting in the dark here.
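
One way to check the negotiation theory is to list the formats YouTube actually offers for this clip and see whether 8-bit H.264 (avc1) streams are available alongside the 10-bit VP9 ones. A minimal sketch using the yt-dlp Python package (an assumption; any format-listing tool would do):

[code]
from yt_dlp import YoutubeDL

URL = "https://www.youtube.com/watch?v=hVvEISFw9w0"

with YoutubeDL({"quiet": True}) as ydl:
    info = ydl.extract_info(URL, download=False)

# Print every video format: if avc1 (H.264) variants are offered next
# to vp9/vp09 ones, Edge may simply have picked a stream the GPU can
# decode in hardware instead of converting anything itself.
for f in info["formats"]:
    if f.get("vcodec") not in (None, "none"):
        print(f"{f.get('format_id'):>5}  {f.get('height')}p  {f.get('vcodec')}")
[/code]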