Guys - what got me confused, and the reason why I started this thread, was something I'd found on the FLAC FAQ page:
https://xiph.org/flac/faq.html

"What is the lowest bitrate (or highest compression) achievable with FLAC?

With FLAC you do not specify a bitrate like with some lossy codecs. It's more like specifying a quality with Vorbis or MPC, except with FLAC the quality is always "lossless" and the resulting bitrate is roughly proportional to the amount of information in the original signal. You cannot control the bitrate much and the result can be from around 100% of the input rate (if you are encoding noise), down to almost 0 (encoding silence)."
Since I was comparing the output from successive rips of the same CD, where the only change was the compression level, I could not work out how the compression level could affect the "original signal" - hence the confusion.
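For what it's worth, you can prove to yourself that the compression level only changes how hard the encoder works, not the audio itself: decode both rips back to PCM and compare the samples. A minimal sketch, assuming the Python soundfile library and hypothetical file names:

```python
import numpy as np
import soundfile as sf  # reads FLAC via libsndfile

# Hypothetical names: the same track ripped at compression levels 0 and 6
fast, sr_fast = sf.read("track01_level0.flac", dtype="int16")
small, sr_small = sf.read("track01_level6.flac", dtype="int16")

# "Lossless" means the decoded samples are identical regardless of level
assert sr_fast == sr_small
print("identical audio:", np.array_equal(fast, small))
```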
This morning I did a bit of investigation into why some of the tracks had a very significant difference in file size and, by implication, bitrate.
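Since FLAC files don't have a fixed bitrate, the "bitrate" here is just the average: file size in bits divided by playing time. A quick sketch of that calculation (hypothetical file name; raw CD audio is 1,411 kbps for comparison):

```python
import os
import soundfile as sf

def average_bitrate_kbps(path):
    """Average bitrate = total bits / playing time in seconds."""
    info = sf.info(path)
    duration_s = info.frames / info.samplerate
    return os.path.getsize(path) * 8 / duration_s / 1000

print(f"{average_bitrate_kbps('track01_level0.flac'):.0f} kbps")
```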
I loaded the album that was ripped with the compression level set to 0 into Playing Now and set it playing, then opened the DSP Studio Analyzer, which provided what I believe to be a significant clue. Out of the 20 tracks on the CD, 14 were very obviously stereo recordings, but the other 6 were "mirror stereo" (a CD's equivalent of mono): initially the orange (volume) and yellow (right channel) waves were visible but the blue (left channel) wave was not, until I noticed that the right channel wave had a green tint to it - presumably because the identical blue left-channel wave was drawn exactly on top of it.
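If you'd rather not eyeball the Analyzer, you can test for "mirror stereo" by comparing the two channels sample-for-sample. A rough sketch, again assuming soundfile and a hypothetical file name:

```python
import numpy as np
import soundfile as sf

def is_dual_mono(path):
    """True if the left and right channels carry identical samples."""
    data, _ = sf.read(path, dtype="int16")  # stereo gives shape (frames, 2)
    return data.ndim == 2 and np.array_equal(data[:, 0], data[:, 1])

print(is_dual_mono("track15_level0.flac"))  # hypothetical file name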
When I checked the file sizes, for the stereo tracks the compression level 0 files were around 10% larger than their compression level 6 equivalents, but for the mirror stereo/mono tracks the compression level 0 files were around 50% larger than their compression level 6 equivalents.
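That pattern would make sense given how the presets differ: if I've read the flac documentation right, level 0 encodes the left and right channels independently, while the higher levels enable stereo decorrelation (mid/side), and when the two channels are identical the side signal is pure silence that compresses to almost nothing - hence the roughly-half file size. A sketch for batch-checking the per-track savings between the two rips, assuming hypothetical folder names:

```python
import os

level0_dir = "rip_level0"  # hypothetical folder names for the two rips
level6_dir = "rip_level6"

for name in sorted(os.listdir(level0_dir)):
    if not name.endswith(".flac"):
        continue
    size0 = os.path.getsize(os.path.join(level0_dir, name))
    size6 = os.path.getsize(os.path.join(level6_dir, name))
    saving = 100 * (size0 - size6) / size0
    # A ~50% saving is a good hint that the track is dual-mono
    print(f"{name}: {size0:,} B -> {size6:,} B ({saving:.0f}% smaller)")
```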