INTERACT FORUM

Author Topic: Hardware decoding for AMD/ATI with MadVR  (Read 10788 times)

bpchia

Hardware decoding for AMD/ATI with MadVR
« on: October 02, 2011, 09:53:42 pm »

Hi all,

I have the following system:
Intel i5 750 2.66GHz (Lynnfield)
4GB RAM
ATI Radeon HD 5670
Win 7 32bit

I know that Nvidia users can use LAV CUVID to hardware-decode video for use with madVR.  I am wondering if AMD/ATI users can use the Cyberlink Video Decoder (PowerDVD 11) in HAM mode with madVR?  I've tried, and it falls back to ffdshow...

Thanks, Ben

bpchia

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #1 on: October 20, 2011, 09:25:34 pm »

Sorry for bumping this, but I'm wondering if anyone has any input on it?

jmone

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #2 on: October 20, 2011, 09:45:06 pm »

I know that SamuriHL was playing with this - you may want to PM him to alert him to this thread.
JRiver CEO Elect

bpchia

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #3 on: October 20, 2011, 11:58:52 pm »

Thanks jmone

SamuriHL

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #4 on: October 21, 2011, 12:01:26 pm »

1) Register CLCvd.ax in your Cyberlink PDVD11 directory
2) Set MC to use Red October HQ with additional filters
3) Add the "Cyberlink Video Decoder (PDVD11)" filter to "Other Filters" on the files you want to use it on
4) Configure the video decoder and turn on HAM mode

That's it, really; there's nothing more to do than that.  Note that I've switched to using LAV Video on all my machines, including my laptop, so I don't use the HAM mode filter anymore, but AFAIK it should still work.
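
If you'd rather script step 1 than run regsvr32 by hand, here's a minimal sketch (Python, purely as an illustration).  The PowerDVD 11 install path is an assumption, so point it at wherever CLCvd.ax actually lives on your system, and run it from an elevated (administrator) prompt:

[code]
# Minimal sketch: register the Cyberlink Video Decoder (CLCvd.ax) with DirectShow
# by shelling out to regsvr32.  Must be run from an elevated (administrator) prompt.
# The PowerDVD 11 path below is only an assumed default -- adjust it to your install.
import subprocess

CLCVD = r"C:\Program Files\CyberLink\PowerDVD11\CLCvd.ax"  # hypothetical default path

# "/s" keeps regsvr32 silent; drop it if you want the confirmation dialog.
subprocess.run(["regsvr32", "/s", CLCVD], check=True)

# To unregister later:
# subprocess.run(["regsvr32", "/s", "/u", CLCVD], check=True)
[/code]

Either way, once the filter is registered it should show up when you add it under "Other Filters" in MC.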

bpchia

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #5 on: October 21, 2011, 06:48:04 pm »

I have registered CLCvd.ax and selected it under additional filters, but when I configure it I have a problem.  When I select the HAM radio button and click Apply/OK, the filter is not used.  When I go back and check the decoder's configuration, it has reverted to SW (software) mode; I can't get the setting to stick.  I don't know if there is another way to access the filter configuration that would make it stick.  Also, the "Video Mode" randomly switches between "auto-select", "force bob" and "force weave".

SamuriHL

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #6 on: October 21, 2011, 07:30:14 pm »

The setting has never appeared to stick for me either, but there's a way to access custom properties while a video is playing (I forget the magic combination... hold down Ctrl when opening the properties, or Shift?) that will show you what it's actually using.  It should say bitstreaming, if I remember right.  It's been several months since I've used it, so I've forgotten most of it.

bpchia

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #7 on: October 22, 2011, 02:34:37 am »

I actually got it to work by installing MC17.  The Cyberlink Decoder setting still doesn't stick on the configuration screen, but when I right-click while a video is playing and look at the filters (is this the "magic combination" you're talking about, right-clicking and selecting the video filter?), the Cyberlink Decoder is being used and the HAM setting DID stick.  This did not work with MC16; the program fell back to using ffdshow (auto-configured).  I was on the latest version of MC16, any ideas why?

Anyway, the Cyberlink Video Decoder with HAM mode and madVR was unwatchable, stuttering badly... are there any madVR settings I need to change?  I unchecked all the decoding options in madVR...

SamuriHL

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #8 on: October 22, 2011, 08:17:23 am »

I'm really not sure on either issue.  It's one of those things where I'd sit and play around with the settings until I got it to work.  Start by setting all the scaling algorithms to bicubic and see if that's smooth.  Actually, I can't assume your video card is set up like mine.  Ugh.  I'll have to get back to you on the CCC settings to use; I'm not in front of that machine right now.  What video driver version are you using?

bpalatt

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #9 on: October 22, 2011, 08:54:35 am »

Can I ask a newbie question about this discussion?  I have a similar PC with a Radeon HD 6450 and also have PowerDVD 11 installed, and I'm wondering why you would want to run MC in the configuration you're suggesting here.  I've watched the same movie in both PowerDVD 11 and MC16 with Red October HQ, and they appear to be the same video quality.  Why not just use the RO HQ default settings?

bpchia

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #10 on: October 23, 2011, 12:15:50 am »

I'm using the latest version of CCC.

Have you switched to LAV Video because of quality?  Why is RO HQ still using ffdshow and not LAV Video?

Are you talking about adjusting the CCC settings or madVR settings?

bpalatt: the reason to use the Cyberlink Video Decoder in HAM mode is to offload the decoding to the GPU, so the CPU has more resources free for rendering.  That matters most if you don't have a super-fast PC; if you do, the default settings are good, probably better...

SamuriHL

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #11 on: October 29, 2011, 07:40:09 am »

This topic completely fell off my radar, sorry about that.  I'm using LAV Video because it works on all my machines, including my laptop.  It replaced the need for HAM mode because the code is so efficient.  Quality doesn't enter into this discussion at all... it should be identical for all decoders unless they're doing something.  The reason LAV Video isn't the default just yet is that it currently has no deinterlacing.  In any case, did you get this straightened out yet?

bpchia

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #12 on: October 29, 2011, 10:31:19 pm »

Thanks for the reply... I'm not going with Cyberlink HAM mode anymore; it's too much fiddling around to get it to work, and I'm not sure it would even work.  My system is probably fast enough to run software decoders anyway...

Where in the chain do you handle deinterlacing on your system?  Can I just set CCC to do the deinterlacing if I'm using LAV Video?

SamuriHL

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #13 on: October 30, 2011, 09:01:31 am »

That's kind of a point of contention these days.  I'm not sure if the CCC settings have any effect on deinterlacing.  I do know that something in my chain is handling it, though; I've always assumed it was my TV.  In any case, hang in there.  It looks like madshi has a trick up his sleeve for us in the future.  Also, nev is working on adding deinterlacing to LAV Video.  I think everyone will be covered soon.

bpchia

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #14 on: October 31, 2011, 05:01:56 pm »

Yeah, thanks... I also thought my TV would handle it.  I've got a mid-range Sony LCD; it should deinterlace as well as or better than a computer, right?

SamuriHL

Re: Hardware decoding for AMD/ATI with MadVR
« Reply #15 on: October 31, 2011, 05:20:16 pm »

Honestly, I have no idea.  Deinterlacing is sort of a black box.  LAV Video using CUVID (nVidia only) handles it quite well, and that's how I prefer to do it on my nVidia machine.  On my AMD machine I have the CCC set to deinterlace, but I have no idea whether it's happening there or my Sony SXRD TV is doing it.  I'd prefer it to be handled in the renderer personally, but I don't much care where it's done as long as the quality is good.