INTERACT FORUM


Author Topic: How do they interact: Intel CPU/GPU, External GPU and madVR  (Read 3539 times)

elo

  • Junior Woodchuck
  • Posts: 87
How do they interact: Intel CPU/GPU, External GPU and madVR
« on: March 30, 2019, 07:30:29 am »

I find it very difficult to understand the video processing in MC for UHD material at a level that would let me dimension a 4K system. The unclear point is mainly how the processing tasks are controlled and split between the different HW components in the system. Can, for instance, the decoding be done by the GPU on the CPU and the rendering be done on an external GPU? I am about to upgrade my system to a true 4K system, which involves changing out nearly all components in the chain (except power amplifiers and loudspeakers), and today there are several possible system layouts to consider.

Currently I am using a high-end “surround” processor/amplifier which acts as a video switch and sound decoder, taking in all video sources and passing them on to the monitor and loudspeakers. The new smart TVs offer another possible layout, namely using the smart TV as the video switch (Samsung QLEDs) and using ARC (audio return channel) for the sound. These TVs have an input box with all connections that is connected to the TV through a thin cable carrying fibre and power. Further, these TVs have built-in apps for the most popular streaming sources (much like the Apple TV). Sources not covered by dedicated apps can be played through Chromecast Ultra or similar.

A modern 4K system must be able to handle several video sources, which offers freedom in the choice of overall system layout:

•   Streaming sources like Netflix, HBO etc. These do NOT offer true 4K at the moment, but should dimension the system for a bitrate of ~20 Mb/s (a rough calculation follows this list)
•   TV stations offering 4K material, much like the above
•   Ripped UHD material from Blu-rays, “uncompressed”
•   Material compressed with encoders like H.264, H.265, VP9, AV1 etc.
•   All legacy material
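
As a rough back-of-the-envelope check (assuming the commonly quoted ~100 Mb/s peak video bitrate for UHD Blu-ray): a ~20 Mb/s stream is 20/8 = 2.5 MB/s, about 9 GB per hour, while a UHD Blu-ray rip at ~100 Mb/s is about 45 GB per hour. It is therefore the rip case, not streaming, that dimensions the decode path.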

Kaby Lake and Coffee Lake Intel CPUs (Coffee Lake has the same graphics core as Kaby Lake) have a built-in GPU which offers HW-accelerated decoding, as in the attachment below.

There is HW support for image-processing functions such as de-interlacing, film cadence detection, Advanced Video Scaler (AVS), detail enhancement, image stabilization, gamut compression, HD adaptive contrast enhancement, skin tone enhancement, total colour control, chroma de-noise, SFC pipe, memory compression, Localized Adaptive Contrast Enhancement (LACE), spatial de-noise, out-of-loop de-blocking (from the AVC decoder), and 16bpc support for de-noise/de-mosaic. There is also a hardware-assisted motion estimation engine for AVC/MPEG2 encode, true motion, and image stabilization applications.

The HW video processing is exposed by the graphics driver using the following APIs (the Direct3D 11 one is probed in the sketch after this list):

•   Direct3D* 9 Video API (DXVA2)
•   Direct3D 11 Video API
•   Intel Media SDK
•   MFT (Media Foundation Transform) filters
•   Intel CUI SDK
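
To check what a given driver actually exposes, the Direct3D 11 video API can be queried directly. A minimal sketch, assuming a Windows SDK toolchain (MSVC, linking d3d11.lib); it lists the decoder-profile GUIDs reported by the active GPU, which can then be matched against the D3D11_DECODER_PROFILE_* constants in d3d11.h (e.g. HEVC Main10):

Code: [Select]
#include <d3d11.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter. Pass a specific IDXGIAdapter
    // as the first argument instead of nullptr to probe the iGPU and the
    // external GPU separately.
    ComPtr<ID3D11Device> device;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    // The video interface is optional; if this QueryInterface fails,
    // the driver exposes no D3D11 video decoding at all.
    ComPtr<ID3D11VideoDevice> video;
    if (FAILED(device.As(&video)))
        return 1;

    // Enumerate every decoder profile the driver reports.
    UINT count = video->GetVideoDecoderProfileCount();
    for (UINT i = 0; i < count; ++i) {
        GUID p;
        if (SUCCEEDED(video->GetVideoDecoderProfile(i, &p)))
            printf("decoder profile %u: %08lX-...\n", i, (unsigned long)p.Data1);
    }
    return 0;
}

Running it once per adapter shows exactly which codecs each GPU can decode in hardware, independent of any player settings.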

There is also support for HW Accelerated Transcoding.

It seems to me that a 9th Gen Intel Coffee Lake on a motherboard that supports HDMI 2.0 through a DP 1.4 to HDMI 2.0 conversion chip would have the necessary functionality to support MC’s video playback system based on madVR rendering. But does it have the necessary computational power? I would like someone with better expertise than me to comment on this.

If we look at external GPUs from the Nvidia 1660 upwards, they are gigantic in size and power consumption. I am not at all interested in gaming and only need a system capable of playing all video material thrown at it. It seems to me to be overkill to use a 2060 GPU alongside Intel’s 7th-generation CPUs onwards.

From the 7th generation of Intel CPUs (Kaby Lake) onwards, the on-chip GPU has HW support for H.264, H.265, HDR etc. (4K). My questions are then @madshi:

With a motherboard with a Kaby Lake CPU and a “weak” GPU like the Nvidia 1050 (which exists in a passively cooled version), can the decoding (4K) be done in hardware on the CPU (or rather its GPU) and the rendering on the external GPU, thus utilizing the HW capabilities to the maximum? (Controlled by MC24 and madVR, with the monitor connected through the HDMI output on the external GPU.)
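
For background on why this split is tricky: the iGPU and the external GPU appear to applications as two separate DXGI adapters, and a D3D11 device (and with it the decoder and the renderer it feeds) is created on one adapter at a time. A minimal sketch, assuming a Windows SDK toolchain (linking dxgi.lib), that lists the adapters a player would see:

Code: [Select]
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    // Walk the adapter list; on an iGPU + dGPU machine both show up here,
    // and a D3D11 device is created on exactly one of them.
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, adapter.ReleaseAndGetAddressOf())
             != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc;
        if (SUCCEEDED(adapter->GetDesc1(&desc)))
            wprintf(L"adapter %u: %s, %llu MB dedicated VRAM\n", i,
                    desc.Description,
                    (unsigned long long)(desc.DedicatedVideoMemory >> 20));
    }
    return 0;
}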

Further, using a 9th-generation Intel CPU on a motherboard with HDMI 2.0: is this strong enough to perform high-level rendering and decoding of H.265-coded video? If yes, this opens up the latest generation of Intel NUCs as a very small and efficient HTPC solution ONLY for media playback (no gaming).

Can these tasks be placed and controlled through the setup interface of madVR?

IAM4UK

  • World Citizen
  • Posts: 242
Re: How do they interact: Intel CPU/GPU, External GPU and madVR
« Reply #1 on: April 04, 2019, 12:06:06 pm »

I have the same questions. Thanks for putting this summary of considerations here; I hope someone can give a targeted answer... it would be a valuable remedy for the "clutter" of information about this topic.

elo

  • Junior Woodchuck
  • Posts: 87
Re: How do they interact: Intel CPU/GPU, External GPU and madVR
« Reply #2 on: April 09, 2019, 03:44:33 pm »

Is there no one that can shed a little light on this?

Manfred

  • Citizen of the Universe
  • Posts: 1038
Re: How do they interact: Intel CPU/GPU, External GPU and madVR
« Reply #3 on: April 10, 2019, 11:19:47 am »

Nvidia decode/encode matrix:

https://developer.nvidia.com/video-encode-decode-gpu-support-matrix

Intel:

https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video#Hardware_decoding_and_encoding

and in the Intel docs:

https://www.intel.com/content/www/us/en/products/docs/processors/core/8th-gen-core-family-datasheet-vol-1.html

My PoV: no current Intel CPU with the iGPU enabled will smoothly play back all the different types of video content, like 1080i upscaled to UHD with RO HQ (Red October HQ). If you want this you need an NVIDIA GPU.

I have an i3 + GTX 960. I can play back all types of video, including UHD HDR material, but sometimes GPU load is >90%. If I were to buy a new system, it would be >=4C/4T and one of the newer GTX/RTX models.
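
If you want to log that GPU figure during playback rather than watch a monitoring tool, NVIDIA exposes the same utilization counters through its NVML library. A minimal sketch, assuming the NVML header and library that ship with the NVIDIA driver/CUDA toolkit:

Code: [Select]
#include <nvml.h>
#include <cstdio>

int main()
{
    // Initialise NVML and grab the first GPU in the system.
    if (nvmlInit() != NVML_SUCCESS)
        return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
        // utilization.gpu is the same percentage monitoring tools show;
        // sample it in a loop during playback to catch the >90% spikes.
        nvmlUtilization_t u;
        if (nvmlDeviceGetUtilizationRates(dev, &u) == NVML_SUCCESS)
            printf("GPU %u%%, memory controller %u%%\n", u.gpu, u.memory);
    }

    nvmlShutdown();
    return 0;
}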

RoderickGI

  • MC Beta Team
  • Citizen of the Universe
  • Posts: 8186
Re: How do they interact: Intel CPU/GPU, External GPU and madVR
« Reply #4 on: April 10, 2019, 04:29:09 pm »

Are you following this thread? https://yabb.jriver.com/interact/index.php/topic,120104.msg830150.html#msg830150
Note this post: https://yabb.jriver.com/interact/index.php/topic,117670.msg830263.html#msg830263

Quote
Can, for instance, the decoding be done by the GPU on the CPU and the rendering be done on an external GPU?

No. This was asked and answered recently somewhere on the forum. Hendrik may have provided the answer. A search might find it.

Quote
...using the ARC (audio return channel) for the sound...

If you are considering using the ARC for all audio, check the specifications very carefully, and then do more research to confirm them. You would want to be able to pass audio through the TV without any processing. My experience with ARC is that the TV may process the audio, re-encoding it, or even down-converting it to stereo. Also, there will be audio lag to adjust for in MC.

nVidia is still the preferred hardware.

rec head

  • Citizen of the Universe
  • Posts: 1012
Re: How do they interact: Intel CPU/GPU, External GPU and madVR
« Reply #5 on: April 18, 2019, 07:36:39 am »

My experience is that for running madVR the GPU is much more important than the CPU. I have an old i5 with a 1060 6GB video card and can run UHD without a problem. I think that upscaling from 1080p or lower to UHD is what really takes the power. One thing I am certain of is that madVR upscales 1080p way better than my TV, and it is worthwhile to have a machine that can do it.

As to using the TV as a video switch, I would advise against it. I don't think ARC can handle TrueHD, Atmos or the DTS equivalents. All content that gets sent through the TV and then out via ARC will probably get downgraded. Your high-end "surround" processor will be much better suited.