"I'm not sure where you got the idea that latency causes distortion."
I am not saying that latency causes distortion. Latency is just another word for delay, and that's all there is to it. For playback it is not a real issue as long as the delay stays constant.
What I am saying is that jitter, i.e. non-linear variation of that latency, causes distortion if it is induced into the data stream that goes in real time straight into the DAC chip.
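To make that concrete, here is a minimal simulation sketch (my own illustration; the 10 kHz tone and the 5 ns RMS jitter figure are just assumed values). It samples a sine wave once with a perfect clock and once with a jittered clock and prints the resulting signal-to-error ratio:

#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define PI 3.14159265358979323846

int main(void) {
    const double fs    = 44100.0;   /* sample rate in Hz */
    const double f     = 10000.0;   /* test tone in Hz (assumed) */
    const double sigma = 5e-9;      /* RMS clock jitter, 5 ns (assumed) */
    const int    N     = 65536;
    double err2 = 0.0, sig2 = 0.0;
    srand(1);
    for (int n = 0; n < N; n++) {
        double t = n / fs;
        /* uniform random jitter, scaled so its RMS value equals sigma;
         * a CONSTANT delay would just shift t and change nothing audible */
        double dt = sigma * sqrt(12.0) * ((double)rand() / RAND_MAX - 0.5);
        double ideal    = sin(2.0 * PI * f * t);         /* steady clock   */
        double jittered = sin(2.0 * PI * f * (t + dt));  /* jittered clock */
        double e = jittered - ideal;
        err2 += e * e;
        sig2 += ideal * ideal;
    }
    /* theory predicts SNR = -20*log10(2*pi*f*sigma), about 70 dB here */
    printf("jitter-limited SNR: %.1f dB\n", 10.0 * log10(sig2 / err2));
    return 0;
}

The error floor rises with both the tone frequency and the jitter amplitude, so a few nanoseconds of timing error at the converter is already enough to fall short of 16-bit resolution.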
The RME site explains the differences very well.
If your soundcard and its drivers are able to cope with this non-linear latency jitter in the data stream, you shouldn't have a big problem.
I know that some high-end soundcards/DACs buffer and reclock the stream before it goes on to the DAC chip, precisely to avoid these kinds of jitter problems.
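Conceptually that buffering/reclocking looks something like the sketch below. This is purely illustrative (not any vendor's actual driver code, and underrun/overrun handling is left out): samples get written whenever USB packets happen to arrive, but they are read out by the DAC's own steady local clock, so the irregular upstream timing never reaches the converter.

#include <stdint.h>

#define RING_SIZE 4096

typedef struct {
    int16_t  buf[RING_SIZE];
    unsigned wr, rd;                 /* write / read positions */
} ring_t;

/* called whenever a USB packet arrives: timing is irregular */
static void ring_write(ring_t *r, const int16_t *src, unsigned n) {
    for (unsigned i = 0; i < n; i++)
        r->buf[(r->wr + i) % RING_SIZE] = src[i];
    r->wr = (r->wr + n) % RING_SIZE;
}

/* called once per sample by the DAC's local crystal clock:
 * this timing stays steady no matter how bumpy the USB side was */
static int16_t ring_read(ring_t *r) {
    int16_t s = r->buf[r->rd];
    r->rd = (r->rd + 1) % RING_SIZE;
    return s;
}

As long as the buffer never runs empty, the only clock the DAC chip ever sees is the local one.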
The major source of latency jitter is interrupt handling, and you can't get around it.
This is, by the way, the reason why UNIX systems are the preferred platforms when talking about real-time applications.
Example: let's say you have a USB soundcard. Playback is running and the data stream is flowing towards it. Suddenly an interrupt pops up saying: "Hold on, I have to read the data from the HD first" to fill the 6000 ms buffer. That one has higher priority than USB. Next one: the graphics card says, "Stop, I have to refresh the GUI."
And there are at least 40 other processes running, many of which have higher priority than the USB bus in charge of your supposedly latency-free real-time data stream.
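You can watch this happen with a small POSIX test program like the sketch below (my own illustration). It asks the OS to wake it up every 1 ms, roughly the granularity a real-time USB audio stream lives on, and reports how late the wake-ups actually were:

#include <stdio.h>
#include <time.h>

int main(void) {
    const long period_ns = 1000000;       /* ask to wake every 1 ms */
    struct timespec req = { 0, period_ns };
    struct timespec t0, t1;
    long worst = 0;
    for (int i = 0; i < 1000; i++) {
        clock_gettime(CLOCK_MONOTONIC, &t0);
        nanosleep(&req, NULL);
        clock_gettime(CLOCK_MONOTONIC, &t1);
        long actual_ns = (t1.tv_sec - t0.tv_sec) * 1000000000L
                       + (t1.tv_nsec - t0.tv_nsec);
        long late_ns = actual_ns - period_ns;   /* how late we woke up */
        if (late_ns > worst) worst = late_ns;
    }
    printf("worst wake-up lateness over 1000 cycles: %ld us\n",
           worst / 1000);
    return 0;
}

Run it once on an idle machine and again while copying a big file from the hard disk; the difference in worst-case lateness is a direct picture of the interrupt/priority effect described above.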
My theory: whether the buffer is 6000 or 12000 is not the key issue. The key issue is to avoid long interrupts, e.g. from your hard disk, during playback, because they can disturb the real-time USB stream at a point in time where there is no chance to clean the latency-jitter effects up anymore.
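A quick back-of-the-envelope check of that (illustrative numbers, and I'm assuming the buffer figure is in milliseconds as in the example above):

#include <stdio.h>

int main(void) {
    const double buffer_ms = 6000.0;  /* playback buffer depth (assumed ms) */
    const double stall_ms  = 200.0;   /* assumed worst-case disk stall */
    printf("buffer holds %.0f ms of audio\n", buffer_ms);
    printf("a %.0f ms stall %s an underrun\n", stall_ms,
           stall_ms < buffer_ms ? "does not cause" : "causes");
    return 0;
}

Even a long stall causes no underrun with a buffer that deep, yet the USB packets around the stall still arrive bunched together. If that irregular arrival timing feeds the DAC clock directly, the buffer depth never entered into it, which is why the interrupt behaviour and not the buffer size is the key issue.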
Anyhow, I think this is a well-known fact, at least to MS. I hope that the "exclusive mode" in Vista will limit these kinds of impacts.