INTERACT FORUM
Devices => Video Cards, Monitors, Televisions, and Projectors => Topic started by: newguy1 on October 18, 2014, 11:40:10 am
-
Hi All
I watch OTA TV in JRiver and I'm trying to find settings that work well with madVR, but I can't get it dialed in, so I'm looking for advice.
I've listed my madVR settings below, but I think this system should do much better than that... am I wrong?
Thanks,
David
JRMark is about 4800, depending on system usage
My System
i7 4770k w/Z87 MoBo
10GB Ram
GTX760 (Zotac)
ASUS Xonar STX
Samsung UN46EH5000
MadVR Settings
Devices - Removed all but my TV and the generic PnP monitor. It looks like madVR sees the generic one as the active monitor, not the TV.
Processing - Stock settings
Scaling -
Chroma - Jinc, 3 taps
Image Doubling - off
Image upscaling - Jinc, 4 taps
Image downscaling - Spline, 3 taps
Rendering - Stock settings except uncheck all 'trade quality for performance' boxes
User interface - Stock settings
-
Check your rendering times (Ctrl+J) and make sure they stay below the vsync interval.
Strictly they only need to stay below the movie frame interval, but try the vsync target first.
Your 760 might not be able to handle 4-tap Jinc upscaling. Try Lanczos 3 with anti-ringing.
Use Random Dithering or Ordered Dithering with "change dither for every frame" enabled.
-
Looking at the ctrl-j screen now, if the display is showing 59.999x and the source is 29.970, is that a bad thing?
I'm guessing that MadVR renders 60 frames instead of 30....
-
Looking at the ctrl-j screen now, if the display is showing 59.999x and the source is 29.970, is that a bad thing?
No, that is what you would expect.
The vsync time will be ~16.67ms and the movie frame interval will be ~33.3ms.
As long as Smooth Motion is not active, your render times should only have to be below the movie frame interval.
-
The ~16.67 and ~33.3 were there for a lower quality clip I played.
I'm trying a 23.976fps clip now, and my vsync is 41.72ms while the movie frame interval is 41.71ms.
Is that bad?
-
1000 / 23.976 = 41.71ms
1000 / 29.97 = 33.37ms
1000 / 59.94 = 16.68ms
etc.
You need to use settings which have render times lower than the movie frame interval, and ideally below the vsync time.
The higher the framerate/refresh rate, the less time the GPU has to process the image.
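If it helps, the arithmetic above can be turned into a quick sanity check. This is just an illustrative sketch — the function names are mine, not anything madVR or JRiver exposes:

```python
def interval_ms(fps):
    """Interval between frames (or vsyncs) in milliseconds."""
    return 1000.0 / fps

def render_time_ok(render_ms, movie_fps):
    """Render time must stay below the movie frame interval."""
    return render_ms < interval_ms(movie_fps)

print(round(interval_ms(23.976), 2))  # 41.71
print(round(interval_ms(29.97), 2))   # 33.37
print(round(interval_ms(59.94), 2))   # 16.68
print(render_time_ok(36.0, 23.976))   # True: 36ms fits inside a 41.71ms frame
print(render_time_ok(44.0, 23.976))   # False: 44ms overruns it
```

So a 44ms render time on 23.976fps material means dropped frames, while 36ms leaves headroom.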
-
Gotcha. I noticed that the render time stayed below the frame interval for the most part, but crept up to 44ms or so.
I've been playing with it a little and changed some settings; I'm down to about 36ms now, but I'm not sure it's the best use of resources:
Chroma upscaling - Lanczos, 3 taps
Image doubling - on
Luma - 16 neurons
Chroma - 16 neurons
Image upscaling - Lanczos, 3 taps
-
Well, now that I'm getting this figured out, I think it's a lot more straightforward than I originally thought.
I do have some questions about the profiles. I think that they'll do what I want.
When using the TV function I can't use image doubling, but I can when I'm watching a recorded show, so I think I should be able to make profiles based on the conditions I have in mind.
My question is: are there a few example profiles available to look at, to reduce my learning curve?
-
There are some details on the profiles here: http://forum.doom9.org/showpost.php?p=1271417&postcount=3
Right now, I'm using:
if (srcHeight<=360) "<SD"
if (srcHeight>360) and (srcHeight<=480) "NTSC"
if (srcHeight>480) and (srcHeight<=576) "PAL"
if (srcHeight>576) and (srcHeight<=1080) "HD"
if (srcHeight>1080) "UHD"
Though I really need to update that.
I'm guessing you would use a rule based on srcInterlaced to set different scaling quality depending on whether the source is interlaced or not.
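Following the pattern of the rules above, something like this might work — untested, and both the srcInterlaced keyword and the "not" operator are guesses, so check madshi's post linked above for the exact variable names:

if (srcInterlaced) "Interlaced TV"
if (not srcInterlaced) "Progressive"

You could then give the "Interlaced TV" profile cheaper scaling settings, since deinterlaced 59.94fps material leaves the GPU half the time per frame.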
-
Haven't got to the coding yet, but having fun trying to figure MadVR out!
I'm loving the line doubling by the way, it easily (subjectively) doubles the quality.
What I'm wondering is why my HDHomeRun automatically changes TV to 59.941fps and madVR calls it a movie; any ideas? I'm guessing that TV should be 30fps (or so).
-
I was wrong; it is supposed to be 60fps for some digital TV signals...