In my experience the card is sensitive to input signal level: too low or too high a level causes occasional pixelation or, in the worst case, picture freezing. I have a 4-way RF splitter off the aerial and had to install a masthead amp, which ended up putting too strong a signal into the tuner card, so I then had to add a 3 dB attenuator to stop the pixelation.
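For anyone trying to diagnose a similar setup, the chain above is really just a dB budget you can add up. Here's a minimal sketch; all the numbers (cable loss, amp gain, the 45-70 dBµV tuner window) are illustrative assumptions, not measurements from my install, so substitute figures from your own equipment datasheets:

```python
# Illustrative RF signal-budget sketch. Every dB figure here is a
# hypothetical placeholder; real values depend on your aerial, cable
# runs, and equipment datasheets.

def signal_at_tuner(source_dbuv, stages):
    """Sum gains (+) and losses (-) in dB along the chain."""
    return source_dbuv + sum(stages.values())

chain = {
    "coax run": -4.0,          # assumed cable loss
    "masthead amp": +20.0,     # assumed amplifier gain
    "4-way splitter": -7.0,    # ~6 dB power split + ~1 dB insertion loss
    "3 dB attenuator": -3.0,   # the pad added to tame the overload
}

level = signal_at_tuner(60.0, chain)  # assume 60 dBuV off the aerial
print(f"Level at tuner: {level:.1f} dBuV")  # → 66.0 dBuV

# Many DVB-T tuners are happiest in a window of very roughly 45-70 dBuV;
# outside it you get the pixelation/freezing described above.
LOW, HIGH = 45.0, 70.0
print("within window" if LOW <= level <= HIGH else "out of window")
```

Without the 3 dB pad the same chain lands at 69 dBµV, right at the top of the assumed window, which is consistent with an overloaded tuner misbehaving intermittently.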
Edit: I should mention the too-low/too-high signal level issue is fairly generic. I've owned a number of computer tuner cards over the years (Hauppauge PVR150 & HVR2200, ATI 550, DigitalNow Dual-Hybrid S2 & QUAD DVBT) and all of them are more sensitive to weak or overloaded signals than normal consumer electronics like TVs and DVRs.
Digital tuners have no effect on the quality of the image; they transparently pass the digital ones and zeros to the decoder. In my experience audio/video sync issues are caused either by the broadcaster (I've seen a lot of this here in NZ with live news broadcasts) or by the decoder/renderer chain.
Edit: I forgot to mention that another common cause of audio/video sync issues is TVs themselves. Their video processing engines can introduce a varying delay in getting the image onto the screen. My Samsung 'C' generation 6200 was really bad at causing variable video delay on the HDMI input when I first used it, but then I discovered that these modern TVs have a 'game mode', a low-latency processing mode that disables several processing-intensive features. As it turns out, Samsung TVs are well known for having high latency, but what took me by surprise was how variable and how severe it was (the video lagged the audio by up to a second). With game mode enabled, all my audio/video sync issues have disappeared (apart from broadcaster-side ones, of course).