INTERACT FORUM

Author Topic: Min CPU/GPU to support .... ?  (Read 4834 times)

mattkhan

Min CPU/GPU to support .... ?
« on: January 08, 2014, 04:57:11 am »

I am currently running a server with an i3-4340 (dual-core Haswell + HD4600 iGPU) and have a few things that I'd like to support but don't appear to have the horsepower for. I am wondering what the recommended hardware would be to support....

1) deinterlacing 1080i with ROHQ (madVR)
- I see CPU utilisation at about 50% and GPU similar (the GPU is not loading up to full clock speed either), yet very jerky playback
- it is not obviously GPU or CPU bound, so I'm not sure what the issue is here

2) streaming BDs to 2 Gizmo clients while also playing back on the server
- streaming a BD to a single Gizmo client takes the CPU to ~70%, and streaming a different BD to a second client maxes the CPU, so things become unstable
- I understand Gizmo streams are always transcoded server-side and it appears CPU bound; is that correct?

3) using convolution (generated by Acourate, if that matters)
- I realise this might be a "how long is a piece of string" question...

Thanks
Matt

mwillems

Re: Min CPU/GPU to support .... ?
« Reply #1 on: January 08, 2014, 08:41:00 am »

Quote from: mattkhan on January 08, 2014, 04:57:11 am

I am currently running a server with an i3-4340 (dual-core Haswell + HD4600 iGPU) and have a few things that I'd like to support but don't appear to have the horsepower for. I am wondering what the recommended hardware would be to support....

1) deinterlacing 1080i with ROHQ (madVR)
- I see CPU utilisation at about 50% and GPU similar (the GPU is not loading up to full clock speed either), yet very jerky playback
- it is not obviously GPU or CPU bound, so I'm not sure what the issue is here

It will depend on what your madVR scaling settings are; even with an i7 and an NVIDIA 550 Ti, I couldn't run madVR at maximum settings without jerkiness. If you're getting jerky playback, regardless of the utilization information, it's probably GPU or CPU bound (more likely GPU than CPU). With my old setup I got dropped frames and stutter, but my video card utilization only showed around 65%; upgrading the video card (to a 660 Ti) eliminated the jerkiness even at the highest settings.

My advice would be to try choosing less intensive scaling algorithms until you find one that plays without issues; I think some folks with hardware similar to yours have gotten ROHQ working with the less GPU-intensive scaling algorithms (softcubic or bilinear). There's more info on the scaling algorithms here: http://yabb.jriver.com/interact/index.php?topic=80253.0

Quote from: mattkhan on January 08, 2014, 04:57:11 am

3) using convolution (generated by Acourate, if that matters)
- I realise this might be a "how long is a piece of string" question...

Thanks
Matt

Convolution will depend entirely on the length/complexity of the convolution filter. If you have a way to generate a test filter, JRiver's convolution module will tell you how it's performing with that filter (20x real time, 80x real time, etc.). My experience has been that when it starts dropping below 10 or 11 times real time, things can get less than ideal. But it's hard to know how your processor will cope without some testing and/or knowing how long the filter is. This will also change drastically if you're doing multi-channel rather than stereo. You could use a free filter-design program like RePhase to generate some notional convolution filters of varying lengths as a sort of "proof of concept" if you don't have Acourate yet.
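
If you want a ballpark figure outside of MC, a few lines of Python can time a stand-in filter against a noise signal. This is a rough sketch, assuming numpy/scipy are installed; the tap count and sample rate are just example values, and MC's own convolution module is the real test:

Code: [Select]
# Rough estimate of how many times faster than real time a convolution
# filter of a given length runs on this CPU, using FFT-based convolution.
# Filter length and sample rate are placeholder assumptions.
import time
import numpy as np
from scipy.signal import fftconvolve

SAMPLE_RATE = 44100      # Hz
FILTER_TAPS = 65536      # a fairly long room-correction filter
TEST_SECONDS = 10        # length of the test signal

noise = np.random.randn(SAMPLE_RATE * TEST_SECONDS)  # stand-in audio
taps = np.random.randn(FILTER_TAPS)                  # stand-in filter

start = time.perf_counter()
_ = fftconvolve(noise, taps)                         # one channel
elapsed = time.perf_counter() - start

# >1 means faster than real time; multi-channel cost scales roughly
# with the number of channels, so divide accordingly for 5.1 or 7.1.
print(f"~{TEST_SECONDS / elapsed:.1f}x real time per channel")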

I have no idea on your second question, but I bet there are other folks who know the answer  ;D

mattkhan

Re: Min CPU/GPU to support .... ?
« Reply #2 on: January 08, 2014, 09:00:35 am »

OK, thanks. I have it on Lanczos 3-tap + anti-ringing at the moment, as per these suggested settings. Which component is actually doing the deinterlacing here?

On the convolution side, I have Acourate already and will be using it for MC and stereo playback. I haven't got it set up yet though... I need to get the PQ OK first.

mwillems

Re: Min CPU/GPU to support .... ?
« Reply #3 on: January 08, 2014, 09:08:47 am »

Quote from: mattkhan on January 08, 2014, 09:00:35 am

OK, thanks. I have it on Lanczos 3-tap + anti-ringing at the moment, as per these suggested settings. Which component is actually doing the deinterlacing here?

Lanczos scaling is one of the more aggressive (read: GPU-intensive) algorithms; you should probably try one of the less intensive scaling algorithms. I'm not sure why you might be getting different results than they did in the article, but they may have configured their test rig differently, or it may be that they were testing on an i7 system (although CPU performance is not normally super relevant for madVR performance).

As for the deinterlacing, I think it's settings-dependent, but I believe madVR should be doing it with the default settings. 6233638 would know more about that, I think.

Manfred

Re: Min CPU/GPU to support .... ?
« Reply #4 on: January 08, 2014, 04:16:40 pm »

I have the same issues as you, mattkhan, with my ThinkPad (Intel Core i5-3320M @ 2.60GHz, HD 4000, 8 GB RAM) and my older PC (Intel Core2 Duo E6600 @ 2.4 GHz, 6 GB RAM, GeForce 8800 GTS, 65 W TDP, 2 cores, 2 threads), as described below in one of my messages on Interact.

http://yabb.jriver.com/interact/index.php?topic=54396.msg578836#msg578836

On both machines I could play any MP4 video, even 1920x1080, using Red October HQ with Lanczos processing, with only very minor issues on the old PC. The same is true for DVD playback.

But not for Blu-ray (playing from the optical disc or from my NAS)!

BD playback works on both machines using Red October Standard.

I have also seen very high CPU utilisation during BD playback. BD playback does not run smoothly using Red October HQ, and changing image processing to bilinear or the other options available in madVR does not make much difference to the CPU utilisation.

On the ThinkPad, playback is better than on the old PC, with lower CPU utilisation. I did not expect that, because the older PC has the better GPU.

My personal feeling, from reading most of the topics on Interact, is that madVR is heavily multithreaded, and maybe it is more important to have higher per-thread performance and more cores than to have a better GPU?

If any users have an i7-4770T or an i5-4670T (both quad-core) with an HD4600 or similar Intel processors, it would be interesting to know whether they have the same issues with BD playback and very high CPU utilisation.

For me it remains an unsolved issue.

Sorry that this did not help you, but I wanted to mention that I have similar issues.
WS (AMD Ryzen 7 5700G, 32 GB DDR4-3200, 8=2x2+4 TB SDD, LG 34UC98-W)-USB|ADI-2 DAC FS|Canton AM5 - File Server (i3-3.9 GHz, 16GB ECC DDR4-2400, 46 TB disk space) - Media Renderer (i3-3.8 GHz, 8GB DDR4-2133, GTX 960)-USB|Devialet D220 Pro|Audeze LCD 2|B&W 804S|LG 4K OLED )

mattkhan

Re: Min CPU/GPU to support .... ?
« Reply #5 on: January 08, 2014, 06:10:41 pm »

I was under the impression that madVR is essentially a DirectX game and hence GPU-dependent; I haven't researched it in detail, though, admittedly. It seems quite hard to get a clear view of what the scaling factors really are here, which is frustrating.

Further testing tonight shows that video-mode deinterlacing is what kills my machine. Forcing madVR into film mode lets me use Lanczos 3-tap + AR at 1080i30 and it runs smoothly. These settings are fine for a 1080i MPEG-4 BD (Adele at the Albert Hall), a 1080p MPEG-4 BD (Wall-E) and a DVD (Tangled), but fail on a VC-1 1080p BD (The Breakfast Club). The last one is a bit of a mess, as lip sync is off even with VideoClock on and JRiver handling the resolution switch. If I turn hardware decoding on, then this disc is a macroblocking mess as well as having bad lip sync.

I do have an i5-4570 in another machine which I could swap over. I may do that if I can't get this stable in the next week or so (unless I get a clear steer that the GPU is the way to go, that is).

mwillems

Re: Min CPU/GPU to support .... ?
« Reply #6 on: January 08, 2014, 07:28:24 pm »

Quote from: mattkhan on January 08, 2014, 06:10:41 pm

I was under the impression that madVR is essentially a DirectX game and hence GPU-dependent; I haven't researched it in detail, though, admittedly. It seems quite hard to get a clear view of what the scaling factors really are here, which is frustrating.

Further testing tonight shows that video-mode deinterlacing is what kills my machine. Forcing madVR into film mode lets me use Lanczos 3-tap + AR at 1080i30 and it runs smoothly. These settings are fine for a 1080i MPEG-4 BD (Adele at the Albert Hall), a 1080p MPEG-4 BD (Wall-E) and a DVD (Tangled), but fail on a VC-1 1080p BD (The Breakfast Club). The last one is a bit of a mess, as lip sync is off even with VideoClock on and JRiver handling the resolution switch. If I turn hardware decoding on, then this disc is a macroblocking mess as well as having bad lip sync.

I do have an i5-4570 in another machine which I could swap over. I may do that if I can't get this stable in the next week or so (unless I get a clear steer that the GPU is the way to go, that is).

You're right: the GPU is the most important thing for madVR performance. The CPU isn't normally a significant contributor, but a much too slow CPU can bottleneck things. Your CPU is modern; the only reason I mentioned it above was that it was one of the differences between the reviewer's system and yours (i.e. I was trying to identify reasons they might have been getting different results with the same GPU).

It sounds like Lanczos works well on an HD4600 for some content and not so well for other content, which is an easier explanation for why you were getting different results than the reviewer. I wouldn't suggest swapping in the i5 without doing more testing. For example, did you try switching to a less intensive scaling algorithm to see if you could get smooth performance?

If everything works smoothly with a less intense scaling algorithm, you probably need a more powerful GPU. If you can't get smooth playback even at the lowest settings (or with Red October Standard), there might be something else limiting it besides the GPU (like the CPU).
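
If you do want to rule the CPU in or out, note that an overall reading like "50%" on a dual-core can hide one core pinned at 100%. A minimal sketch for watching per-core load during playback, assuming Python with the psutil package installed (this is not anything built into MC):

Code: [Select]
# Log per-core CPU load once a second while a video plays; a single
# core sitting near 100% suggests a single-thread bottleneck even
# though the overall figure looks moderate.
import psutil

print("Watching per-core CPU load; Ctrl+C to stop.")
try:
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        flag = "  <- possible single-core bottleneck" if max(per_core) > 90 else ""
        print(" ".join(f"{p:5.1f}%" for p in per_core) + flag)
except KeyboardInterrupt:
    pass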

glynor

Re: Min CPU/GPU to support .... ?
« Reply #7 on: January 08, 2014, 11:17:10 pm »

Quote from: mwillems on January 08, 2014, 07:28:24 pm

You're right: the GPU is the most important thing for madVR performance. The CPU isn't normally a significant contributor, but a much too slow CPU can bottleneck things. Your CPU is modern; the only reason I mentioned it above was that it was one of the differences between the reviewer's system and yours (i.e. I was trying to identify reasons they might have been getting different results with the same GPU).

It sounds like Lanczos works well on an HD4600 for some content and not so well for other content, which is an easier explanation for why you were getting different results than the reviewer. I wouldn't suggest swapping in the i5 without doing more testing. For example, did you try switching to a less intensive scaling algorithm to see if you could get smooth performance?

If everything works smoothly with a less intense scaling algorithm, you probably need a more powerful GPU. If you can't get smooth playback even at the lowest settings (or with Red October Standard), there might be something else limiting it besides the GPU (like the CPU).

+1 to everything that guy said.
"Some cultures are defined by their relationship to cheese."

Visit me on the Interweb Thingie: http://glynor.com/

mattkhan

Re: Min CPU/GPU to support .... ?
« Reply #8 on: January 09, 2014, 03:18:44 am »

The odd thing is that the VC-1 encoding is perfectly smooth, and the madVR stats show that the total time spent is well within budget, so I don't think it's a question of tuning madVR further; it's just not lip-synced. I think I need to find some other VC-1 discs to see if it's a one-off or not.

Manfred

Re: Min CPU/GPU to support .... ?
« Reply #9 on: January 12, 2014, 12:46:46 pm »

Hi,

I am replying in this thread because I commented on the original message.

OK, I have sorted out my problem now and understand why I could not play BDs using Red October HQ. I did some performance tests today with different content on DVD and BD, using GPU-Z to measure GPU usage:

The simple conclusion: for Red October HQ, my PC (Intel(R) Core(TM)2 CPU 6600 @ 2.4 GHz, 6 GB RAM 800 MHz | NVIDIA GeForce 8800 GTS 320 MB | Win 7 64-bit | JRMark 1750) does not have enough video RAM to process BDs, whereas the HD4000 GPU on my ThinkPad (Intel(R) i5-3320M @ 2.6 GHz, 8 GB RAM 1600 MHz | HD 4000 650 MHz, 800 MHz | Win 7 64-bit | JRMark 3477) is not strong enough (~100% GPU load) to play BDs using Red October HQ.

The results are attached in the pdf.

That may also give others an idea of why BD playback on their hardware is not smooth using Red October HQ.

So I will see which HTPC I buy for my living room in the next few weeks. The ThinkPad is my daily working environment, and I used it only for testing purposes.

Best regards

Manfred
WS (AMD Ryzen 7 5700G, 32 GB DDR4-3200, 8=2x2+4 TB SDD, LG 34UC98-W)-USB|ADI-2 DAC FS|Canton AM5 - File Server (i3-3.9 GHz, 16GB ECC DDR4-2400, 46 TB disk space) - Media Renderer (i3-3.8 GHz, 8GB DDR4-2133, GTX 960)-USB|Devialet D220 Pro|Audeze LCD 2|B&W 804S|LG 4K OLED )

felix2

Re: Min CPU/GPU to support .... ?
« Reply #10 on: January 15, 2014, 10:04:12 pm »

To all who have commented:
My setup is:
- HTPC based on an AMD Athlon X4 645 @ 3.1GHz; RAM is 6GB; video card is an AMD HD5670 1GB GDDR5. [This hardware is nothing to brag about.] Win 8.1 64-bit, MC19. The MC19 video renderer is Red October HD.
- Video server based on a Core2 Duo 1.8GHz running WinServer 2011.
- LAN is gigabit

With the above moderate gear, I can play BD discs on the HTPC, and BD-quality video files from the server, WITH ABSOLUTELY PERFECT QUALITY. Here are the performance numbers while playing BD-quality video:
- CPU utilization 45%, peaking at 50%
- RAM used: no more than 1.5GB
- MC19 process utilization about 38%, peaking at 40%
- LAN utilization 34 Mbps when streaming video files from the server

As you can see, my system is not being stressed at all. To estimate how demanding a video file's playback will be on the computer, use:
   resolution X compressor type X overall bitrate

Resolution is obvious: 1920x1080 for HD. All interlaced video will be converted to progressive, either by the player app or directly by the TV (digital TV screens can only display progressive).

The compressor is mostly H.264, or MPEG2 for older video. H.264 (also called AVC) is so advanced and efficient that it demands half as much as MPEG2 when decoding the same-quality video. H.264 can be accelerated by hardware but MPEG2 cannot, and this must be enabled in the player app.

Overall bitrate = bitrate of video + bitrate of audio. For a given resolution, the bitrate measures the degree of compression by the coder. You can have 1080 video from YouTube at a low bitrate of 10Mbps, from your prosumer-grade camcorder at 15Mbps, from satellite TV at 18-20Mbps, from a broadcast video camera at 25Mbps, and finally from a Hollywood blockbuster movie on BD at the ultimate bitrate of 35Mbps. All are 1080 resolution, but building hardware and choosing apps able to play such a wide range of bitrates is what trips up a lot of HTPC builders.

It is not just the video resolution but the overall bitrate that is important in setting up an HTPC system. So my quite moderate HTPC hardware is able to play HD up to 35Mbps perfectly under MC19. But it fails if I use Cyberlink PowerDVD, because that app cannot handle bitrates above 20Mbps. I know of no other app that can play a video file (which I edited and rendered) with a bitrate of 36.5Mbps; JRiver deserves the highest accolade for designing such a fine product.
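
If you want to put a file on that bitrate scale yourself, you can estimate its overall bitrate from size and runtime. A rough sketch in Python; the 38 GB size and 2h20m runtime are made-up examples, and variable-bitrate files will only average out:

Code: [Select]
# Average overall bitrate (video + audio) from container size and runtime.
import os

def overall_bitrate_mbps(size_bytes: int, duration_seconds: float) -> float:
    """Average overall bitrate in Mbps."""
    return size_bytes * 8 / duration_seconds / 1_000_000

# e.g. a 38 GB Blu-ray title running 2h20m:
size = 38 * 1024**3                                        # bytes
print(f"{overall_bitrate_mbps(size, 140 * 60):.1f} Mbps")  # ~38.9 Mbps

# for a real file: size = os.path.getsize("path/to/movie.m2ts")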

Note that I put all the hard work of playback on the HTPC. I do not want my server to do any decoding, just send the files to the HTPC. The reasons are:
1) If I used the server to decode video, it would quickly get overloaded after a few streams.
2) I can't use GPU hardware to help with the decoding, because the output does not go to the HDMI port but back to CPU memory, and then gets sent over the network with the considerable overhead of TCP/IP.
3) I would have to spend a great deal of money to build a video-capable server. A video server (as opposed to a file server that happens to serve video files) is a complicated and expensive proposition. Just ask Netflix.

I use Windows Server 2011 to run my server because of its absolute robustness in all aspects. It is quite different from regular Windows 7 64-bit.

I've tried various configurations of madVR. The difference in utilization and video quality is quite minor. If it works at the defaults it will work at any settings, but if it does not work well in the default configuration, then the problem must first be found elsewhere.

One source of jitter and similar problems that many overlook is the HDMI path from the HTPC video card port all the way to the TV. Many setups go through an AV receiver, while some have an HDMI splitter/switch in between. ALL of them must be able to handle the combined resolution X bitrate you send over from the HTPC, or you WILL see all kinds of jitter, bad color, or strange problems. Especially the AV receiver.
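
For a sense of scale on that last point (my own back-of-the-envelope numbers, not measured): the HDMI link carries uncompressed video, so what each device in the chain has to pass is set by resolution, refresh rate and bit depth rather than by the file's compressed bitrate:

Code: [Select]
# Active-pixel payload of an uncompressed video link; real HDMI adds
# blanking intervals and 8b/10b coding overhead on top of this.
def hdmi_data_rate_gbps(width: int, height: int, fps: float,
                        bits_per_pixel: int = 24) -> float:
    return width * height * fps * bits_per_pixel / 1e9

# 1080p60 at 8-bit RGB: ~3 Gbps of pixel data, comfortably inside
# HDMI 1.4's 10.2 Gbps link maximum, but an old receiver or a cheap
# splitter can still fail to negotiate it cleanly.
print(f"{hdmi_data_rate_gbps(1920, 1080, 60):.2f} Gbps")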

I am just telling you my experience. Take what you can if it helps. All the best to my fellow HTPC freaks!!!
