Professionally, I'm a network engineer, specializing in wireless (802.11), so I can actually speak to that question.
When discussing LAN applications, one of the factors that we define is the "primary performance characteristic" (or PPC) of the application. The PPC of an application is the single factor that most affects the user's perception of adequate performance. The two main PPCs are bandwidth and latency.
Bandwidth is the amount of data that can be moved per unit time. It is typically measured in bits per second or bytes per second (1 byte = 8 bits). Latency is the response time of the network. Latency can be further broken down into three sub-components: out-bound network latency, processing time, and in-bound network latency. Out-bound network latency is the time it takes a request to get from the "requesting station" to the "requested station". Processing time is the time it takes the "requested station" to think about the request and prepare the answer. In-bound network latency is the time it takes the answer to get back from the "requested station" to the "requesting station".
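The latency breakdown above is just a sum of the three pieces. A tiny sketch, with figures invented purely for illustration:

```python
# Hypothetical numbers -- real values depend entirely on your network.
outbound_ms = 20.0    # request travels from requesting to requested station
processing_ms = 5.0   # requested station thinks about the request
inbound_ms = 20.0     # answer travels back to the requesting station

total_latency_ms = outbound_ms + processing_ms + inbound_ms
print(total_latency_ms)  # 45.0
```

The point is that the user perceives all three components as one number: the time between asking and getting an answer.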
Consider a huge file download. For this type of application, bandwidth is the PPC. It doesn't matter if the request takes a long time to get from the requesting station to the requested station, because the requester can usually send multiple requests at a time. It doesn't have to wait for the answer to come back before sending the next request. Take this example:
Request 1-> | |
Request 2-> | |
Request 3-> | <-Response 1 |
Request 4-> | <-Response 2 |
... | ... |
Request 100-> | <-Response 98 |
| <-Response 99 |
| <-Response 100 |
Even though there is a long delay between each request and its associated response, the net effect is very small compared to the total time of the file transfer, as long as bandwidth is high.
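You can see why the latency mostly washes out by comparing the two strategies numerically. This is a back-of-the-envelope sketch with made-up numbers (100 requests, 100 ms of round-trip latency per request, 1 second of transfer time per chunk):

```python
n_requests = 100
round_trip_ms = 100    # out-bound latency + processing + in-bound latency
transfer_ms = 1000     # time to move one chunk at the link's bandwidth

# Serial: wait for each response before sending the next request,
# so the round-trip latency is paid on every single chunk.
serial_ms = n_requests * (round_trip_ms + transfer_ms)

# Pipelined: keep many requests in flight at once, so the latency
# is paid (roughly) only once, at the start.
pipelined_ms = round_trip_ms + n_requests * transfer_ms

print(serial_ms, pipelined_ms)  # 110000 100100
```

With pipelining, latency adds 100 ms to a 100-second transfer, which is negligible; bandwidth, not latency, dominates the total time.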
Now, consider the case of streaming audio/video. Here, the PPC is latency, and more importantly, the consistency of the latency. Your media server is pumping data to your media player at a certain rate and your media player is buffering that data. If, for any reason, the data stream is interrupted, and the buffer empties, then you'll experience a skip.
What could cause this interruption? It's certainly true that in a low-bandwidth environment, such as dialup, this kind of interruption is more likely. The media server goes to send data and there's just no bandwidth left for it to do so. Skip! But once you've got adequate bandwidth for the data stream, adding more doesn't really help, because streaming media sends data at a more-or-less constant rate. If your stream needs 100 Kbps, and there is 4 Mbps available, it doesn't matter--the media server isn't going to ever send more than 100 Kbps!
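The "extra bandwidth goes unused" point can be boiled down to one line: the server sends at the stream's encoded rate or the available bandwidth, whichever is lower. A sketch (the helper name and rates are mine, for illustration):

```python
def sender_rate_kbps(stream_rate_kbps, available_kbps):
    # A streaming server sends at the stream's encoded rate, never faster.
    # It can only fall short when available bandwidth drops below that rate.
    return min(stream_rate_kbps, available_kbps)

print(sender_rate_kbps(100, 56))    # 56  -> dialup: the stream starves. Skip!
print(sender_rate_kbps(100, 4000))  # 100 -> the extra 3.9 Mbps sits idle
```

Once the available bandwidth clears the stream rate, adding more changes nothing.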
Another cause of skipping (prevalent in the wireless environment) is corrupted frames. A data frame is sent, but something corrupts it on the way to the receiver so that it is lost. Typically, streaming media does not retransmit corrupted frames, because the delay caused by retransmission would be worse than the skip caused by the lost frame. Because 802.11b and 802.11g networks use the very crowded 2.4 GHz band, and because RF signals in general are very susceptible to interference, corrupted frames are likely even in the best wireless network. Notice that this type of skipping is completely independent of how much bandwidth you've got. You could have 1000 Gbps of bandwidth and a corrupted frame would still cause skipping. This is the most likely cause of the skipping you're describing.
Now, it actually turns out that even though the streaming media application won't retransmit a corrupted frame, 802.11 networks themselves can detect corruption and retransmit at the link layer. But even when this happens, the retransmission imposes a significant additional delay on the packet. This sudden delay can cause skipping in a video stream.
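To get a feel for why link-layer retries hurt a stream, here's a rough delay model. All of the timing figures are invented for illustration; real 802.11 timing depends on the data rate, contention, and the driver's retry limit:

```python
# Illustrative numbers only -- not taken from the 802.11 standard.
frame_airtime_ms = 2.0   # time to transmit one data frame over the air
backoff_ms = 1.5         # average contention backoff before each retry

def delivery_delay_ms(retries):
    # First attempt costs one airtime; each retry adds a fresh backoff
    # plus another airtime before the frame finally gets through.
    return frame_airtime_ms + retries * (backoff_ms + frame_airtime_ms)

print(delivery_delay_ms(0))  # 2.0  -> clean delivery
print(delivery_delay_ms(3))  # 12.5 -> three retries: over 6x the delay
```

A frame that suddenly takes several times longer than its neighbors is exactly the kind of latency spike that empties a playback buffer.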
One last thing: earlier, I said that consistency of latency was the PPC for streaming media. The reason for this is that streaming media can compensate for high latency to some degree. If there is high latency between the server and the player, then there will be a large delay between the time when you hit "play" and the time the stream starts playing, but once it gets going, it will play fine. But if the latency suddenly changes (the network gets slower), then the server and player won't be able to compensate, and you'll have a skip.
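The startup-buffer behavior can be captured in a toy playback model. Everything here (buffer size, chunk durations, the spike) is made up to show the principle, not measured from a real player:

```python
def playback_skips(arrival_gaps_ms, startup_buffer_ms=500, chunk_playtime_ms=100):
    # arrival_gaps_ms: time between successive chunks arriving at the player.
    # The player pre-buffers startup_buffer_ms of media before it starts.
    buffered = startup_buffer_ms
    skips = 0
    for gap in arrival_gaps_ms:
        buffered -= gap            # the player drains while waiting for data
        if buffered < 0:
            skips += 1             # buffer emptied: audible/visible skip
            buffered = 0
        buffered += chunk_playtime_ms  # the arriving chunk adds playtime
    return skips

# Constant latency, even if high: chunks arrive as fast as they're played.
print(playback_skips([100] * 20))                    # 0

# Same average rate, but one sudden 800 ms delay spike mid-stream.
print(playback_skips([100] * 10 + [800] + [100] * 9))  # 1
```

High-but-steady latency just means a longer wait before "play" starts; it's the sudden change that outruns the buffer and causes the skip.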