Timing and EMI are the main issues, rather than just the 1's and 0's.
EMI will
only impact audio quality when using analog outputs (post-DAC). Digital bitstreams on copper ARE certainly susceptible to EMI, like any electrical signal. The difference is that digital signals sent over a serial digital bus (like USB, 1394, or SPDIF) cannot "degrade" gracefully when they encounter EMI.
People think of EMI as impacting digital audio in the context they've learned over years of dealing with analog audio (ground loops and whatnot), but they're not the same thing at all. Any data coming over a digital bus will have a certain amount of redundancy built in for error detection and correction, so that if the bitstream becomes corrupted en route, the devices on the bus actually KNOW that it has been corrupted. The receiver can either recover the data using the error-correcting parity bits (in which case playback is still bit-for-bit perfect) or lose the data completely. At that point, either you'll hear a
very audible "gap" in playback, or if there is enough time in the buffer (and the bus supports it) the recipient can request that the sender re-send the missing data. EMI literally cannot "color" the music or anything like that over a digital bus. It can corrupt it, but you'd absolutely KNOW that something was going on, because you'd get either nothing at all or horrible "screeching and static" noises. Either the interference isn't enough to change the data, or the data stream becomes corrupted beyond what the error correction can handle and it fails. Flipping even one "bit" from an EMI blast doesn't "color" the sound,
it breaks it.
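If it helps to see that detect-or-perfect behavior concretely, here's a toy Python sketch. It isn't any real bus format (S/PDIF, USB, and 1394 each frame and check their data differently, and real links layer parity/ECC on top so some errors can be repaired outright); it just shows the detection half: a checksum over each frame either verifies, in which case the samples are bit-for-bit what was sent, or fails, in which case the receiver knows the frame is bad and can drop it, mute, or ask for a re-send.

```python
import random
import zlib

def make_frame(samples: bytes) -> bytes:
    """Append a CRC32 over the PCM payload (toy framing, not any real bus format)."""
    return samples + zlib.crc32(samples).to_bytes(4, "big")

def receive_frame(frame: bytes):
    """Return the payload if the checksum matches; otherwise signal a known-bad frame."""
    payload, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    if zlib.crc32(payload) == crc:
        return payload        # bit-for-bit identical to what the sender put on the wire
    return None               # corruption detected: drop it, mute, or ask for a re-send

# Pretend EMI flips a single bit somewhere in the frame on its way across the bus
pcm = bytes(random.getrandbits(8) for _ in range(64))
frame = bytearray(make_frame(pcm))
frame[random.randrange(len(frame))] ^= 0x01

print("clean frame:    ", receive_frame(make_frame(pcm)) == pcm)
print("corrupted frame:", receive_frame(bytes(frame)))
```

There's no path in there where a flipped bit quietly becomes "slightly warmer" sound: either the payload comes back exactly as sent, or the receiver knows the frame is garbage.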
Timing is certainly something else, and that's why any DAC will buffer the input before sending it to output. The buffer on the DAC allows it to "smooth out" the jitter and other latency problems on the input signal. That latency can come from the sending device or from the DAC itself (introduced, for example, while it processes error correction). As long as the latency doesn't exceed what the DAC's buffer can absorb, it should not have an audible impact (unless the DAC is very poorly designed). If a buffer underrun occurs, again, the audio will just stop or "skip", not change in "quality".
Of course, computers are much more variable in the workload they have to process, so the jitter patterns are possibly substantially more complex than what you'd get off of a standalone component (though not always; I've seen standalone devices outrun their buffers plenty of times, especially when loading a particularly "complex" feature of some kind). So, it is theoretically possible (though unlikely) that you could end up with audible problems due to jitter, if the jitter is so severe that it is constantly "on the edge" of causing a buffer underrun on the DAC. If that is the case, you probably either have serious EMI problems or a very underpowered PC. Audio is just not that "difficult" of a computational task for modern CPUs to handle.
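Here's a rough Python simulation of that buffering behavior. All the numbers are made up purely for illustration (this isn't modeling any particular DAC or bus): the source delivers chunks somewhat erratically, the DAC drains its buffer at a perfectly steady rate set by its own clock, and nothing audible happens until the buffer actually runs dry, at which point you get a dropout rather than "worse" sound.

```python
import random

BUFFER_CAPACITY = 4096   # samples the DAC's input buffer holds (made-up, illustrative)
DRAIN_PER_TICK = 441     # samples the DAC's own clock consumes every tick, perfectly steady

def simulate(stall_chance, ticks=2000, seed=0):
    """Toy model: the source usually delivers a chunk each tick but sometimes stalls
    (a busy PC, a disc player chewing on a menu). The DAC drains at a fixed rate.
    Jitter is inaudible until the buffer actually runs dry; then you get a gap."""
    random.seed(seed)
    fill, dropouts = BUFFER_CAPACITY, 0
    for _ in range(ticks):
        if random.random() > stall_chance:            # source delivered on time this tick
            fill = min(BUFFER_CAPACITY, fill + 500)   # a bit more than the DAC drains
        if fill >= DRAIN_PER_TICK:
            fill -= DRAIN_PER_TICK                    # normal, steady playback
        else:
            fill, dropouts = 0, dropouts + 1          # underrun: an audible gap/skip
    return dropouts

print("occasional stalls:", simulate(stall_chance=0.05), "dropouts")
print("constant stalls:  ", simulate(stall_chance=0.40), "dropouts")
```

With occasional stalls the buffer never empties and you should see zero dropouts; with constant stalls it underruns over and over. There's no middle setting where the output comes through but sounds subtly degraded.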
Likewise, computers are absolutely NOT the only devices that suffer from jitter. In fact, that Blu-ray player (or DVR, game console, DVD player, or even standalone CD player) you have really IS a
computer inside. If you tear the case open and look, you'll see that they use many of the
exact same components as you'd find on your PC's motherboard. There are only so many chip designs for things like DSPs and whatnot out there, and real custom silicon is incredibly expensive (we're talking Billions with a B just to get test spins of silicon designs which might not even work). I'm sure there are a few audio devices that use
FPGAs to do some processing, but those are still expensive and SLOW (and difficult to program effectively without a large team of highly-paid chip engineers). I'd be willing to bet that actual custom silicon (where they contract TSMC or TI to build them a real chip) in the audio processing space is either completely non-existent or incredibly rare and reserved for things way out of all of our price ranges. There just isn't a good reason to do it, and the costs would price you out of even the audiophile market without economies of scale. Even if they DID do it, I would be very skeptical that they could do it well. Custom silicon design is an incredibly complex and expensive process to do correctly, and there are all sorts of things that can go wrong. Custom is absolutely NOT better. Custom is generally LESS reliable.
Now, to be clear, that isn't to say that there isn't a difference between a high-quality DAC and a cheap consumer one! Far from it. It is just that the differences typically lie in features (buffer size, accuracy of the timing clocks, etc.) on the digital side, and on the ANALOG side (post-decode). And, it is entirely possible that different DACs will react to "unexpected conditions" in unpredictable ways post-decode. I can, at least, imagine that very high levels of jitter and/or latency could conceivably cause problems with some specialized DACs that expect a pristine input signal (I'd call that poor design, but who knows). It is also possible that some DACs might react differently on the analog side to different types of connections on the input side (even though the bits are the same, the DAC might process the SPDIF input differently than it does the USB or HDMI input). They probably shouldn't (the same digital bits coming in, once decoded, should route to the exact same system for output), but who knows. I can imagine that DACs probably have complex logic to work around many of the "errors" that are common on crappier input devices (improperly formatted streams and whatnot). That logic introduces complexity that could conceivably "color" the analog output side, I suppose (again, I'd blame the DAC though).
But, for a digital input bitstream sent over a digital serial bus, either the information gets there
perfectly or it doesn't get there at all. There is no in-between like there is with an analog signal.