INTERACT FORUM


Author Topic: Power Consumption: Red October Standard vs HQ (with LAV CUVID)  (Read 9279 times)

Matt

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 42048
  • Shoes gone again!

Test
I was wondering if GPU acceleration actually saved power.  So I did some tests of power consumption when watching a 1080p Blu-ray from a hard drive.

Obviously the answer will vary with the hardware you have, but I thought the results might be interesting nonetheless.

Hardware
CPU: Intel 2600k (32 nm)
Video card: nVidia GTX 480 (40 nm)
Display: 1920x1200 monitor (so no resizing of the 1080p source material)

Results
Power usage, measured at the wall with a Kill-a-Watt:
Nothing playing: 120 watts
Red October Standard: 150 watts (for the first 20 seconds, it's closer to 200 watts, then settles and varies around 150 by about 10 watts)
Red October w/ additional filters (EVR + LAV CUVID): 183 watts (varying by 2 or 3 watts)
Red October HQ (madVR + LAV CUVID): 190 watts (varying by 2 or 3 watts)

Conclusions
madVR: 7 watts more than EVR
LAV CUVID: 33 watts more than CPU decoding
Cost difference if you watch a movie a day: $3.50 each year [2 hours/day, $0.12/kWh]
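For anyone checking the arithmetic, the figure comes from the 40-watt difference between Red October Standard (150 W) and HQ (190 W); a quick sketch, using the bracketed assumptions above:

```python
# Yearly cost of the extra draw for Red October HQ vs Standard,
# using the figures measured above.
extra_watts = 190 - 150      # HQ (madVR + LAV CUVID) minus Standard
hours_per_day = 2            # one movie a day
rate_per_kwh = 0.12          # dollars per kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"${kwh_per_year * rate_per_kwh:.2f} per year")  # -> $3.50
```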
Logged
Matt Ashland, JRiver Media Center

jmone

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 14277
  • I won! I won!
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #1 on: July 29, 2011, 10:58:26 pm »

Nice.... Can I have some of that $0.12/kWh please!
Logged
JRiver CEO Elect

jmone

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 14277
  • I won! I won!
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #2 on: July 29, 2011, 11:12:31 pm »

Also, being American, why the metric unit (kilowatts)? Surely you measure such things in "pound candles" or...?  ;D
Logged

mojave

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 3732
  • Requires "iTunes or better" so I installed JRiver
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #3 on: July 29, 2011, 11:39:08 pm »

Nice work, Matt.

Nice.... Can I have some of that $0.12/kWh please!
I'm at only $0.045/kWh!  :o
Logged

jmone

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 14277
  • I won! I won!
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #4 on: July 29, 2011, 11:55:51 pm »

 :'(  We are getting done over down under... and they want to add a carbon dioxide tax on top now as well! Std rates are 22.66 cents per kWh for the first 1,750 kWh per quarter, then 32 cents per kWh after that!
Logged

fitbrit

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 4877
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #5 on: August 01, 2011, 10:02:28 pm »

Test
I was wondering if GPU acceleration actually saved power.  So I did some tests of power consumption when watching a 1080p Blu-ray from a hard drive.

Obviously the answer will vary with the hardware you have, but I thought the results might be interesting nonetheless.

Hardware
CPU: Intel 2600k (32 nm)
Video card: nVidia GTX 480 (40 nm)
Display: 1920x1200 monitor (so no resizing of the 1080p source material)

Results
Power usage, measured at the wall with a Kill-a-Watt:
Nothing playing: 120 watts
Red October Standard: 150 watts (for the first 20 seconds, it's closer to 200 watts, then settles and varies around 150 by about 10 watts)
Red October w/ additional filters (EVR + LAV CUVID): 183 watts (varying by 2 or 3 watts)
Red October HQ (madVR + LAV CUVID): 190 watts (varying by 2 or 3 watts)

Conclusions
madVR: 7 watts more than EVR
LAV CUVID: 33 watts more than CPU decoding
Cost difference if you watch a movie a day: $3.50 each year [2 hours/day, $0.12/kWh]

Hi Matt.
You kind of touched upon this, but:
I just want to emphasise that the GTX 480 is a gaming card, not necessarily the type one would want in a home theatre PC. Of course it's power hungry and may use up more energy than it should when being used for CUDA. It would be cool to see the same tests using a GT 240 or GTS 450, for example, sticking with nVidia. A similar test with a passive Radeon 5450 and DXVA would yield VERY different results, I'd venture.
Logged

Daydream

  • Citizen of the Universe
  • *****
  • Posts: 770
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #6 on: August 02, 2011, 04:11:27 am »

:'(  We are getting done over down under... and they want to add a carbon dioxide tax on top now as well! Std rates are 22.66 cents per kWh for the first 1,750 kWh per quarter, then 32 cents per kWh after that!

There's still a ban on nuclear power over there right? As opposed to the 65 plants here... Might need to think carefully on which hemisphere you wanna build your HTPC :)
Logged

Mike Noe

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 792
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #7 on: August 02, 2011, 03:22:09 pm »

Of course it's power hungry and may use up more energy than it should when being used for CUDA. ...

Unless I'm mistaken, nevcariel's LAVCUVID does not actually use CUDA for the decoding.  The CUDA interface is simply used to access the VPx engine.
Logged
openSUSE TW/Plasma5 x86_64 | Win10Pro/RX560
S.M.S.L USB-DAC => Transcendent GG Pre (kit) => Transcendent mono OTLs (kit)
(heavily modded) Hammer Dynamics Super-12s (kit)
(optionally) VonSchweikert VR8s

fitbrit

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 4877
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #8 on: August 02, 2011, 03:50:41 pm »

Unless I'm mistaken, nevcariel's LAVCUVID does not actually use CUDA for the decoding.  The CUDA interface is simply used to access the VPx engine.

I think you might be mistaken, unless I am not interpreting this correctly.
Logged

gtgray

  • Galactic Citizen
  • ****
  • Posts: 261
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #9 on: August 02, 2011, 04:08:57 pm »

I think the point is well made that gaming cards, while very powerful, are very inefficient for something like madVR. A GT 440 or the GT 545 I use run CUVID and madVR very well. My Sandy Bridge i3 + GT 545 uses around 75 watts playing back very high bitrate content, and 35 watts at idle.

This is my desktop rig:

37" Panny LED: 37 watts
Denon 591 (idling): 35 watts
i3 2100 + GT 545 (idling): 35 watts
-------------------------------------
107 watts idling
under 200 watts total running high-bitrate Blu-ray at or near reference volume

10.1 cents per kWh here.


Logged

Mike Noe

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 792
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #10 on: August 02, 2011, 04:46:32 pm »

I think you might be mistaken, unless I am not interpreting this correctly.

Not to argue and nev could give the definitive word, but it seems the third sentence in that post confirms my point:

Quote from: nev
It is a DirectShow Video Decoder utilizing the NVIDIA hardware decoder engine through the CUDA Video Decoding API ("CUVID").
Logged

fitbrit

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 4877
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #11 on: August 02, 2011, 07:01:50 pm »

Not to argue and nev could give the definitive word, but it seems the third sentence in that post confirms my point:

Quote from: nev
It is a DirectShow Video Decoder utilizing the NVIDIA hardware decoder engine through the CUDA Video Decoding API ("CUVID").

Like I said, I might not be interpreting that correctly, or more likely, I've always considered that (incorrectly, probably) to mean CUDA decoding. I always thought that nVidia specific hardware acceleration was CUDA.
Logged

Mike Noe

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 792
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #12 on: August 02, 2011, 07:37:06 pm »

FitBrit, yup, here is some info you might be interested in:  nVidia PureVideo
Logged

Daydream

  • Citizen of the Universe
  • *****
  • Posts: 770
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #13 on: August 02, 2011, 07:54:33 pm »

I'm not an expert in this field, not as much anyway as the people programming for these features, but let's settle it once and for all:
- PureVideo is a hardware feature on Nvidia GPUs. It's a dedicated circuit inside the GPU.
- CUDA gives programmers a way to leverage the parallel processing power of Nvidia GPUs for tasks other than rendering polygons and smoke in games. For cryptography, for example. Or for, hehe, video processing.
- The CUDA Video Decoding API (CUVID) is a way for programmers to access the above PureVideo circuit (or VPx; there are about five generations, and plenty of naming confusion in Nvidia's nomenclature) to take care of the straight-up decoding, plus the rest of the GPU for post-processing tasks that can be done with CUDA (of major interest here: deinterlacing).
- Of importance is the realization that all these post-processing tasks, deinterlacing included (but also de-noising and a few others that the user may or may not enable), are always done in shaders, meaning by the GPU proper, and not by that special circuit that just decodes frames.

In details that exceed my knowledge, the bottom line is that with CUVID you can decode the frames using the dedicated circuit on the GPU, process the decoded frames with the shaders in the GPU (say, deinterlace them), and then transfer them to system memory using some more CUDA magic. Somewhere in between all this you can put madVR with its own magic. To note is that there is a way to control how this process works (Nvidia-supplied APIs, SDKs and whatnot) and therefore write programs for it.


In AMD land, we have UVD (1/2/2.2/3) as the piece of circuitry that does the decoding. And we have OpenCL to harness the GPU's power. For various obvious or less obvious reasons (I'm no programmer, so I don't know that much) it's not as straightforward, well documented, or as easy to work with OpenCL as with CUDA. Hence no specifically written goodies for AMD. Then we're left with DXVA.

DXVA is a Microsoft API and works with both AMD and Nvidia GPUs. Of importance here is that DXVA can access the hardware decoding and deinterlacing features on either brand of GPU. But, for whatever reason (one I don't have the knowledge to explain properly), DXVA doesn't work with madVR. You'd have to write a decoder that taps into the hardware resources, decodes and deinterlaces frames, and passes them to madVR in the manner that madVR expects (and there are memory bandwidth requirements that have to be met, or everything is a slideshow). The guy who did this already has his own closed-source player (PotPlayer), and given that he doesn't speak much English, the chances that he would cooperate with the team developing MPC-HC (and thus provide free-for-all standalone DirectShow filters that could connect to madVR) fell apart.

Logged

glynor

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 19608
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #14 on: August 03, 2011, 09:04:32 am »

For various obvious or less obvious reasons (I'm no programmer, so I don't know that much) it's not as straightforward, well documented, or as easy to work with OpenCL as with CUDA.

The reasons are actually almost entirely marketing driven.

Nvidia pushed CUDA very hard, and paid out millions of dollars to developers (in both direct cash payments, and in support costs) to "promote" its use.  By the way, CUDA is not really a "technology" in the classic sense of the term, but a broad marketing term for a collection of programming interfaces designed to work with Nvidia GPUs.  The goal of this marketing move was simple: drive adoption of CUDA (a proprietary "language") which requires their GPUs.  Then CUDA support would be a viable "value-add" for their GPUs, which would allow them to market them to consumers more effectively (you can't use that AMD card, even if it is faster/better/cheaper, because it doesn't support CUDA).  AMD has a similar "product" called Stream Computing, but they aren't "pushing" quite the same thing.

Apple, AMD, and Intel (along with a variety of other companies) are pushing an interoperable GPGPU programming standard called OpenCL (while Microsoft is pushing DirectCompute).  These systems all do essentially the same thing, they allow the programmer to access the massive parallel computational resources available on modern GPUs and multi-core CPUs.  However, with less "skin in the game" they didn't market the technologies as directly, or with as much money or support (OpenCL got "support" from Apple by them designing it and building it into their OS).  Therefore, adoption by third-party developers (particularly smaller ones) has been slower.  However, OpenCL was actually developed by Apple along with Nvidia.  It is absolutely based on CUDA, and can sort-of be viewed as a CUDA+ that can run on a much wider variety of hardware (and isn't artificially limited on multicore CPUs like CUDA-proper for marketing reasons).

There are certainly some advantages to the current CUDA approach.  However, there are some huge advantages to developing interoperable code, and new C++ programming interfaces are coming around the bend.  And, because CUDA did NOT get the kind of massive adoption that Nvidia was looking for (I think mainly because Fermi was underperforming and terribly power hungry with the first release), Nvidia has dramatically reduced the amount of money they're dumping into that hole for now.  CUDA never caught on like wildfire like they'd been hoping, and, frankly, Sandy Bridge is so powerful that you really don't need the GPU acceleration for most things (and when you get right down to it, current GPUs are still not perfectly suited to GPGPU style usage).

Interoperable GPGPU APIs are almost certainly the future (as is a unified memory address space, one of the biggest problems with current GPUs for general purpose programming).
Logged
"Some cultures are defined by their relationship to cheese."

Visit me on the Interweb Thingie: http://glynor.com/

Mike Noe

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 792
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #15 on: August 06, 2011, 07:41:43 pm »

I just got a lower-end Toshiba Llano machine and have been doing some testing this weekend with MPC-HC and MC16; figured I'd post some quick info:

Win7 Home Premium 64bit
Playback video - Blade Runner Blu-Ray, ripped to MKV.
I use a Kill-a-watt at the wall and Toshiba has an Eco tool that shows power draw.

Hardware
CPU: AMD A6-3400M QuadCore (6 GB RAM)
Video card: HD 6520G xGPU
Display: 1600x900 monitor

Results (Eco-mode OFF)

Nothing playing: 15 watts
Red October Standard (DXVA ON): 30 watts
Red October HQ w/ additional filters (madVR + LAVAudio): 31 watts

Results (Toshiba Eco-mode ON)

Nothing playing: 11 watts
Red October Standard (DXVA ON): 24 watts
Red October HQ w/ additional filters (madVR + LAVAudio): 29 watts

Flawless smooth playback, extremely quiet setup.
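Applying the same yearly-cost arithmetic from the top of the thread to these numbers (2 hours/day at $0.12/kWh are Matt's assumptions, not measured here):

```python
# Yearly cost of 2 h/day playback on the Llano box at $0.12/kWh,
# using the eco-mode-off Red October HQ figure above.
watts_playing = 31
kwh_per_year = watts_playing / 1000 * 2 * 365
print(f"${kwh_per_year * 0.12:.2f} per year")  # -> $2.72
```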
Logged

JustinChase

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 3273
  • Getting older every day
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #16 on: August 06, 2011, 08:59:43 pm »

Red October HQ w/ additional filters (madVR + LAVAudio): 29 watts

I'm curious why you're adding these here? Aren't they the default for RO HQ? I know adding them will let you use "your" versions, and I can't find a way to change the settings on the RO-installed filters (rightfully so), so I'm just curious whether you're doing any tweaking, or why you're adding them?

thanks.

Oh, and wow; only 29 watts?!?!  Impressive
Logged
pretend this is something funny

Mike Noe

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 792
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #17 on: August 07, 2011, 08:35:54 am »

The main reason is that I frequently switch between madVR decoding, LAV decoding, and FFDShow decoding for testing purposes. Sometimes I also switch to EVR for rendering. I set LAVAudio/LAVVideo, and then I have an external utility that changes the registry for the respective format.

I should also probably note that I've disabled LAN and BlueTooth, fwiw.
Logged

Osho

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1211
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #18 on: August 08, 2011, 04:29:32 pm »

Interesting study :).

So, since MC16 defaults to Red October Standard, it should be able to claim tons of carbon credits? :). Multiply the number of current licensees by 1 hour a day of video watching on average with the default setting - that's a lot of greenhouse gas reduction!

Osho
Logged

Daydream

  • Citizen of the Universe
  • *****
  • Posts: 770
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #19 on: August 08, 2011, 07:15:50 pm »

[...]
Video card: HD 6520G xGPU
[...]
Red October HQ w/ additional filters (madVR + LAVAudio): 31 watts

Flawless smooth playback, extremely quiet setup.

Try a 1080/30i file and let us know. I'm extremely curious how that setup behaves. Looks like the memory controller in the mobile Llano is pulling some magic...?
Logged

jmone

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 14277
  • I won! I won!
Re: Power Consumption: Red October Standard vs HQ (with LAV CUVID)
« Reply #20 on: August 20, 2011, 12:37:10 am »

Conclusions
madVR: 7 watts more than EVR
LAV CUVID: 33 watts more than CPU decoding
Cost difference if you watch a movie a day: $3.50 each year [2 hours/day, $0.12/kWh]

Ahhh, from 6233638's posting over at Doom9, it looks like CUVID is kicking the GPU into the P0 state when you only need P8, according to these posts... and you could cut 40 watts off that!
 
Quote from: 6233638;1520556
I have just been looking into this today, after measuring the power consumption of my system.
 
It's not an ideal solution, but Nvidia Inspector's Multi Display Power Saver tool (right-click the "show overclocking" button to access it) allows you to force the GPU into the lower clocked P8 (Video) power state rather than going into the P0 (Full 3D) power state with CUVID.
 
It's a less-than-ideal tool to be using, but works for now, cutting my power consumption by 7W compared to CPU decoding.
 
 
Hopefully Nevcariel can figure out why CUVID is kicking the GPU into the Full 3D P0 power state rather than the Video P8 power state.

EDIT: Actually, this tool is awesome!
Rather than adding mpc-hc.exe to the P8 Applications list, I have set it to activate the P8 state by VPU usage, and set that to 40%. (may need to experiment with this)

With the HD videos I have tried so far, the lowest VPU usage for decoding has been around 45%, so this ensures it is not switching states in use. (that causes severe video stuttering)

However, it lets the GPU run in the extra-low-power 2D mode (P12) with less demanding videos (e.g. MP4 off the web) which has dropped my power consumption with them to 92W compared to 103W in P8. For reference, my system idles at 90W.

Quote from: 6233638;1520564
That was what I suspected. Please check my updated post above regarding the use of this tool.
 
It activates the P8 (Video) power state with HD videos, but for SD videos, it keeps it in the P12 (2D) state, which is just enough with my GTX 570 to let madVR perform the scaling I like (90% GPU load) dropping my power consumption a further 8W below DXVA.

92W with forced P12, MadVR/CUVID & SD video.
100W with EVR-CP/DXVA (P8) & SD/HD video.
103W with forced P8, MadVR/CUVID & HD video.
147W with standard P0 and MadVR/CUVID & SD/HD video.
 
Before I also started throttling my CPU, my system's power consumption was almost 180W today, so I have almost halved my power consumption in some situations. (CPU was also going to full clockspeed on playback previously)
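Running the same cost arithmetic over those quoted figures shows what the P-state fix is worth: forcing P8 instead of P0 with madVR/CUVID saves 44 W during playback (147 W vs 103 W). At this thread's 2 hours/day and $0.12/kWh assumptions:

```python
# Yearly savings from forcing P8 instead of P0 with madVR/CUVID,
# per the Doom9 measurements quoted above (147 W vs 103 W).
saved_watts = 147 - 103
kwh_per_year = saved_watts / 1000 * 2 * 365
print(f"${kwh_per_year * 0.12:.2f} per year saved")  # -> $3.85
```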

Logged