Sorry I was not precise enough.
My initial questions came about because I thought MC might be allocating a fixed amount of memory at startup and filling it on an ongoing basis, or perhaps allocating memory for a given album or playlist and changing that allocation for different albums and playlists. That does not appear to be the case. MC does not try to determine how much physical memory is available (up to the new 1 GB limit) and load multiple tracks to fill that memory.
Here is a simple explanation of what is going on, as I now understand it. If I am incorrect, please let me know.
MC loads a single track into memory at the start of that track's playback. It appears to simply allocate and de-allocate memory dynamically (as C++ programs normally do) as needed between tracks. It does not load multiple tracks, even if there is enough memory available to do so.
If the track is larger than the amount it can allocate, it loads what it can and reloads the rest of the track when necessary. Since I do not have a track large enough to fill the new 1 GB limit, I cannot test how it handles such a track, but that case is also not relevant for the vast majority of tracks.
MC does not set a fixed block of memory aside and keep that filled. That would be an option if you wanted, for example, to pre-allocate memory to be sure you have it available. In that case, you could allocate a large amount of memory and pre-load all the tracks on most albums, for example, and not have to do I/O between tracks. It appears MC has elected not to do that.
Given that the memory allocated is simply the size of a track, there is no real need for MC to show how much memory is being allocated (it is just the track size, other than for extremely long tracks).
One practical conclusion from this is that people with memory-constrained systems should not rush out to get more memory because of the new 1 GB allocation, unless they have extremely large tracks.
glynor - I am just trying to understand the process. You may find that strange, but I see nothing wrong with being curious and trying to be well informed.