To all who have commented:
My setup is:
- HTPC based on an AMD Athlon X4 645 @ 3.1GHz, 6GB RAM, and an AMD HD5670 video card with 1GB GDDR5. [This hardware is nothing to brag about.] Win8.1-64, MC19. The MC19 video renderer is Red October HD.
- Video server based on a Core 2 Duo @ 1.8GHz running WinServer 2011.
- LAN is gigabit
With the above moderate gear, I can play BD discs from the HTPC, and BD-quality video files from the server, WITH ABSOLUTELY PERFECT QUALITY. Here are the performance numbers while playing BD-quality video:
- CPU utilization about 45%, peaking at 50%.
- RAM usage no more than 1.5GB
- MC19 process utilization about 38%, peaking at 40%
- LAN utilization at 34 Mbps when streaming video files from the server
As you can see, my system is not being stressed at all. To estimate how much demand a video file places on the computer during playback, consider:
resolution X compressor type X overall bitrate
Resolution is obvious: 1920x1080 for HD. All interlaced video will be converted to progressive either by the player app or directly by the TV (a digital TV screen can only display progressive).
Compressor is mostly H.264, or MPEG2 for older video. H.264 (also called AVC) is so advanced and efficient that it needs only about half the bitrate of MPEG2 for the same quality video. H.264 decoding can also be accelerated by the GPU hardware, but this must be enabled in the player app.
Overall bitrate = bitrate of video + bitrate of audio. For a given resolution, the bitrate measures the degree of compression applied by the encoder. You can have 1080 video from YouTube with a low bitrate of 10Mbps, from a prosumer-grade camcorder at 15Mbps, from satellite TV at 18-20Mbps, from a broadcast video camera at 25Mbps, and finally from a Hollywood blockbuster movie on BD with the ultimate bitrate of 35Mbps (see the rough comparison sketched below). All are 1080 resolution, but finding hardware and a player app that can handle such a wide range of bitrates is what trips up a lot of HTPC builders.
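To put those numbers in perspective, here is a minimal sketch (plain Python, purely illustrative; the bitrates are the ballpark figures above, not measurements) showing how little of a gigabit LAN even the heaviest of these streams uses. The network is rarely the bottleneck; the decoding on the playback box is:

sources = {                      # all 1080p, overall bitrate in Mbps (video + audio)
    "YouTube 1080p":      10,
    "Prosumer camcorder": 15,
    "Satellite TV":       20,
    "Broadcast camera":   25,
    "Blu-ray movie":      35,
}
GIGABIT_LAN_MBPS = 1000          # nominal gigabit link speed

for name, mbps in sources.items():
    share = 100.0 * mbps / GIGABIT_LAN_MBPS
    print(f"{name:<20} {mbps:>3} Mbps  ->  ~{share:.1f}% of a gigabit LAN")

Even the 35Mbps BD stream is only a few percent of the link, which matches the 34 Mbps LAN utilization I see while streaming.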
It is not just the video resolution but the overall bitrate that's important in setting up an HTPC system. So my quite moderate HTPC hardware is able to play HD up to 35Mbps perfectly under MC19. But it fails if I use Cyberlink PowerDVD, because that app cannot handle bitrates above 20Mbps. I know of no other app that can play a video file (which I edited and rendered myself) with a bitrate of 36.5Mbps - JRiver deserves the highest accolade for designing such a fine product.
Note that I put all the hard work of playback on the HTPC. I do not want my server to do any decoding, just send the files to the HTPC. The reasons are:
1) If I use the server to decode video, it will quickly get overloaded after a few streams.
2) I can't use GPU hardware to help with the decoding, because the decoded output does not go to the HDMI port; it goes back to CPU memory and then gets sent over the network with the considerable overhead of TCP/IP (see the rough math sketched after this list).
3) I would have to spend a great deal of money to build a video-capable server. A video server (as opposed to a file server that happens to serve video files) is a complicated and expensive proposition. Just ask Netflix.
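Here is a rough back-of-the-envelope sketch of why point 2 matters (Python, with assumed but typical numbers: 8-bit 4:2:0 chroma, 24 fps film). A decoded 1080p stream is more than an order of magnitude bigger than the compressed file, so having the server decode and then push raw frames over the LAN makes no sense:

WIDTH, HEIGHT, FPS = 1920, 1080, 24
BYTES_PER_PIXEL = 1.5            # 8-bit YUV 4:2:0, the usual decoded format

raw_mbps = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS * 8 / 1e6
compressed_mbps = 35             # a BD-quality file, as above

print(f"Decoded 1080p24 video: ~{raw_mbps:.0f} Mbps")           # roughly 600 Mbps
print(f"Compressed BD file:    ~{compressed_mbps} Mbps")
print(f"Server-side decoding means ~{raw_mbps / compressed_mbps:.0f}x more data on the wire")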
I use Windows Server 2011 to run my server because of its absolute robustness in all aspects. It is quite different than regular Windows 7-64.
I've tried various configurations of madVR. The difference in utilization and video quality is quite minor. If it works with the default settings it will work with any of them; but if it does not work well in the default configuration, the problem must first be found elsewhere.
One source of jitter and similar problems that many overlook is the HDMI path from the HTPC video card port all the way to the TV. Many setups go through an AV receiver, and some have an HDMI splitter/switch in between. ALL of them must be able to handle the full resolution and bandwidth you send from the HTPC! Otherwise you WILL see all kinds of jitter, bad color, or strange problems. Especially the AV receiver.
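For a sense of scale, the HDMI link carries the picture uncompressed, so every device in that chain sees far more data than the file's bitrate. A minimal sketch, assuming 1080p60 RGB at 8 bits per channel and ignoring blanking intervals (real HDMI link rates are higher still):

WIDTH, HEIGHT, FPS = 1920, 1080, 60
BITS_PER_PIXEL = 24              # 8-bit RGB

hdmi_gbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e9
print(f"Uncompressed 1080p60 over HDMI: ~{hdmi_gbps:.1f} Gbps") # about 3 Gbps

A marginal receiver or cheap splitter that cannot sustain that kind of rate is exactly where the jitter and color problems show up.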
I am just telling you my experience. Take what you can if it helps. All the best to my fellow HTPC freaks!!!