INTERACT FORUM
More => Old Versions => Media Center 16 (Development Ended) => Topic started by: bpchia on October 02, 2011, 09:53:42 pm
-
Hi all,
I have the following system:
Intel i5 750 2.66GHz (Lynnfield)
4GB RAM
ATI Radeon HD 5670
Win 7 32bit
I know that Nvidia users can use LAV CUVID to hardware-decode video for use with madVR. I'm wondering whether AMD/ATI users can use the Cyberlink Video Decoder (PowerDVD 11) in HAM (hardware acceleration) mode with madVR? I've tried, and it falls back to ffdshow...
Thanks, Ben
-
Sorry for bumping this, but I'm wondering if anyone has any input on it?
-
I know that SamuriHL was playing with this - you may want to PM him to alert him to this thread.
-
Thanks jmone
-
1) Register CLCvd.ax in your Cyberlink PDVD11 directory
2) Set MC to use Red October HQ with additional filters
3) Add the "Cyberlink Video Decoder (PDVD11)" filter to "Other Filters" on the files you want to use it on
4) Configure the video decoder and turn on HAM mode
That's it, really. Nothing more to do than that. It should be noted that I've switched to using LAV Video on all my machines, including my laptop, so I don't use the HAM mode filter anymore. But, AFAIK, it should still work.
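For step 1, registering the filter is the standard regsvr32 call from an elevated Command Prompt. The path below is a guess at the default PowerDVD 11 install location; adjust it to wherever CLCvd.ax actually lives on your system:

```shell
# Run from an elevated (Administrator) Command Prompt on Windows.
# The directory is an assumed default install path -- point it at
# your actual PowerDVD 11 folder if it differs.
regsvr32 "C:\Program Files\CyberLink\PowerDVD11\CLCvd.ax"
```

If you want to back it out later, `regsvr32 /u` with the same path unregisters the filter.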
-
I have registered CLCvd.ax and selected it under additional filters, but when I configure it I have a problem. When I select the HAM radio button and click Apply/OK, the filter is not used. I went back and checked the decoder's configuration, and it had reverted to SW (software) mode. I can't get the configuration to stick, and I don't know if there's another way to access the filter configuration that would make it stick. Also, the "Video Mode" randomly switches between "auto-select", "force bob" and "force weave".
-
The setting has never SHOWN as stuck for me, but there's a way to access the custom properties while a video is playing (I forget the magic combination... hold down Ctrl when opening the properties, or Shift?) that will show you what it's actually using. It should say bitstreaming, if I remember right. It's been several months since I've used it, so I've forgotten most of it.
-
I actually got it to work by installing MC17. The Cyberlink decoder setting still doesn't stick on the configuration screen, but when I right-click while a video is playing and look at the filters (is this the "magic combination" you're talking about, to right-click and select video?), the Cyberlink decoder is being used and the HAM setting DID STICK. This did not work with MC16, even the latest version; the program fell back to ffdshow (auto-configured). Any ideas why?
Anyway, the Cyberlink Video Decoder with HAM mode and madVR was unwatchable, stuttering badly... are there any madVR settings I need to change? I unchecked all the decoding options in madVR...
-
I'm really not sure on either issue. It's one of those things where I'd sit and play around with the settings until I got it to work. Start by setting all the scaling algorithms to bicubic and see if that's smooth. Actually, I can't assume your video card is set up like mine. Ugh. I'll have to get back to you on the CCC settings to use; I'm not in front of that machine right now. What video driver version are you using?
-
Can I ask a newbie question about this discussion? I have a similar PC with a Radeon HD 6450 and also have PowerDVD 11 installed, and I'm wondering why you would want to run MC in the configuration you're suggesting. I've watched the same movie in both PowerDVD 11 and MC16 with Red October HQ, and they appear to be the same video quality. Why not just use the RO HQ default settings?
-
I'm using the latest version of CCC.
Have you switched to LAV Video because of quality? Why is RO HQ still using ffdshow and not LAV Video?
Are you talking about adjusting the CCC settings or madVR settings?
bpalatt: the reason to use the Cyberlink Video Decoder in HAM mode is to offload the decoding to the GPU, so more CPU resources are available for rendering. This matters most if you don't have a super-fast PC; if you do, the default settings are good, probably better...
-
This topic completely fell off my radar. Sorry about that. I'm using LAV Video because it works on all my machines, including my laptop. It replaced the need for HAM mode because the code is so efficient. Quality doesn't enter into this discussion at all... it should be identical for all decoders unless they're doing something extra. The reason LAV Video isn't the default just yet is that it currently has no deinterlacing. In any case, did you get this straightened out yet?
-
Thanks for the reply... I'm not going with Cyberlink HAM mode anymore; it's too much fiddling around to get it to work, and I'm not sure it would even work... my system is probably fast enough to run software decoders anyway...
Where in the chain do you do deinterlacing in your system? Can I just set CCC to do the deinterlacing if using LAV Video?
-
That's kind of a point of contention these days. I'm not sure if the CCC settings have any effect on deinterlacing. I do know that something in my chain is handling it, though; I've always assumed it was my TV. In any case, hang in there. It looks like madshi has a trick up his sleeve coming for us in the future. Also know that nev is working on adding deinterlacing to LAV Video. I think everyone will be covered soon.
-
Yeah, thanks... I also thought my TV would handle it... I've got a mid-range Sony LCD; it should deinterlace as well as or better than a computer, right?
-
Honestly, I have no idea. Deinterlacing is sort of a black box. LAV Video using CUVID (nVidia only) handles it quite well, and that's how I prefer to do it on my nVidia machine. On my AMD machine, I have it set in CCC to deinterlace, but I have no idea if it's happening there or if my Sony SXRD TV is doing it. I'd prefer it to be handled in the renderer personally, but I don't much care where it's done as long as the quality is good.