The recording/production/mastering is the single most important ingredient in a wonderful-sounding recording.
Absolutely.
No degree of resolution can help a poor recording.
Yes… but it's also important for people to understand that sample rate is not resolution.

Listening for bit depth/resolution requires some training/practice. I can now generally tell any file at 256kbps or lower, but at 16/44.1 (and I include lossless FLAC here) versus 320kbps it is not as easy... I think I can tell maybe 50% of the time through my home speakers and maybe 70% of the time through headphones, but a wonderfully engineered recording blurs the difference. Listening in a car with just about any stereo (including pricey optional ones), I don't believe you would ever tell the difference between a CD and a 320kbps file. Furthermore, higher-resolution files in a car make no sense to me. It's just too noisy.
You are talking about bit-rate in lossy compression here, not bit-depth.
Bit-rate reduction via lossy compression - especially with a codec like MP3 - can be audible. Even high-bit-rate MP3 has artifacts that are inherent to the codec, which is why it should not be used any more. AAC produced using a good encoder (I use Apple's via QAAC) is far superior.
I doubt I could pass a 320k MP3 vs FLAC blind test with 100% certainty. I haven't done an ABX test for these yet, but I use unrestricted VBR AAC encoding for portable devices, which tends to produce files about 1/3 to 1/2 the size of the equivalent FLAC, and I've yet to find a track where I've noticed the compression. I still keep the FLAC originals on my media server though, where storage is not a concern.
Hi-res files at 24/96 do not seem to distinguish themselves very much from 16/44.1. 24/192 files distinguish themselves more often, but NOT consistently, nor most of the time. My guess is they are noticeably better ~40% of the time... again, for what I sampled, which was certainly not chosen with any criteria as to what might have mattered (like age of recording, newly remastered, etc. ... what I listened to/tested was what I came across and had interest in). When hi-res was noticeably better to my ears, was it because of remastering? Source material? Resolution? I would think if it was the resolution, I'd hear a noticeable improvement more often than I did.
If it sounds different (better) it's almost certainly sourced from a different master. You need to compare 16-bit 44.1kHz tracks produced from the same "high-res" source.
I once read (it was a while ago) that bit depth, and not the resolution, represents the biggest improvement in hi-res audio. I didn't hear that, or at least can't draw that conclusion.
Well bit-depth is probably the closest thing to "resolution" in a PCM file. But if the track is dithered, literally the only difference bit-depth makes is how quiet the noise floor will be.
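To illustrate the dithering point, here is a minimal sketch (my own, not from the thread) of TPDF dither, which adds a tiny amount of triangular-distribution noise before quantization so the error behaves like a constant, benign noise floor rather than signal-correlated distortion. All helper names are mine.

```python
import random

random.seed(0)  # deterministic for the illustration


def tpdf_dither_lsb():
    """Triangular (TPDF) dither in units of one LSB: sum of two uniforms."""
    return random.random() - random.random()


def quantize_with_dither(x, bits=16):
    """Quantize a full-scale (-1.0..1.0) sample to `bits` bits with TPDF dither."""
    step = 1.0 / 2 ** (bits - 1)  # one LSB at this bit depth
    return round(x / step + tpdf_dither_lsb()) * step


y = quantize_with_dither(0.3, bits=16)
# y lands within ~1.5 LSB of the input; the only cost of the dither is a
# slightly raised, but steady, noise floor.
```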
With 16-bit audio, the noise floor should be inaudible in most setups to begin with. If it does become audible, it's likely that your system is loud enough to cause hearing damage within minutes.
Note: this is different from the playback device's bit-depth.
If you are using digital volume control, particularly if your device is paired with an amplifier that has too much gain, it's possible that there will be a clear difference in noise between a 16-bit and 24-bit DAC.
But that is a completely separate issue from the bit-depth of the file being played.
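For reference, the noise-floor figures in this discussion come straight from the standard dynamic-range approximation for an ideal quantizer, roughly 6 dB per bit. A quick sketch (function name is mine):

```python
def dynamic_range_db(bits):
    """Approximate SNR of an ideal quantizer for a full-scale sine:
    6.02 dB per bit plus 1.76 dB."""
    return 6.02 * bits + 1.76


dr16 = dynamic_range_db(16)  # ~98 dB: noise floor inaudible at sane volumes
dr24 = dynamic_range_db(24)  # ~146 dB: beyond what any analog chain delivers
```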
One question for the learned people on this topic about the larger file size of hi-res files: does ALL of that additional data fall below and above the human hearing range? If not, it's quite easy to conclude it adds to the depth of the recording. If so, I'm inclined to believe what improvements I hear are NOT attributable to the bit depth or resolution.
Yes.
One of the things a lot of people incorrectly assume is that sample rate can affect the "timing" of a signal, but that is not how digital audio works.
I recommend watching the whole video to get a basic understanding of how digital audio works, but here is a short demo showing how sample rate does not place restrictions on timing:
https://www.youtube.com/watch?v=cIQ9IXSUzuM#t=1254

High bit-depths and sample rates do matter for music production.
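The timing point above can also be checked with simple arithmetic rather than taking the video on faith: delay a sampled sine by a fraction of one sample period and the sample values still change, so sub-sample timing survives sampling. A minimal sketch, with values chosen by me:

```python
import math

fs = 44_100        # sample rate in Hz
f = 1_000          # test tone frequency in Hz
delay = 0.1 / fs   # one tenth of a sample period

original = [math.sin(2 * math.pi * f * n / fs) for n in range(8)]
delayed = [math.sin(2 * math.pi * f * (n / fs - delay)) for n in range(8)]

# The sub-sample delay changes the sample values, so timing information
# "between" the samples is still captured by the sample grid.
captured = original != delayed
```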
If you are recording audio at 44.1kHz, which can capture frequencies up to 22.05kHz, and something in the room emits a higher-frequency noise at, say, 30kHz (like some of the examples posted above), that will cause aliasing if it is picked up by the microphone rather than being filtered out.
That 30kHz noise would then be folded back into the signal and cause distortion in the audible range - at ~14.1kHz, since 44.1kHz − 30kHz = 14.1kHz.
Recording at 96kHz can capture all frequencies up to 48kHz without aliasing.
That 30kHz noise would be present in the recording, but would not cause aliasing and distortion inside the audible range since it is a valid signal for a 96kHz recording.
You can then produce a 44.1kHz track from that 96kHz recording which has everything above ~21kHz filtered out digitally so that there is no aliasing.
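The fold-back arithmetic above can be sketched without any signal-processing library: with no anti-aliasing filter, an out-of-band tone appears at its distance from the nearest multiple of the sample rate. Names are mine:

```python
def alias_frequency(f_hz, fs_hz):
    """Apparent frequency of a tone at f_hz after sampling at fs_hz with
    no anti-aliasing filter: distance to the nearest multiple of fs."""
    return abs(f_hz - round(f_hz / fs_hz) * fs_hz)


folded_441 = alias_frequency(30_000, 44_100)  # folds to 14100.0 Hz: audible
folded_96 = alias_frequency(30_000, 96_000)   # stays 30000.0 Hz: below Nyquist
```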
It's a similar situation with bit-depth.
If you record at 16-bit, and something plays too loud, it may blow out the sound and cause it to clip. Clipping is a very audible and ugly distortion - a loud crackling noise.
If you record at 24-bit you have an additional 8 bits of headroom, or about 48dB. Now that very loud noise will not cause clipping distortion in the recording.
The actual dynamic range of a mastered music track is nothing close to the ~96dB of 16-bit audio, so there's no need to use 24-bit for a delivery/playback format, but it matters for recording because it helps prevent clipping distortion.
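A sketch of the clipping point, assuming a simple hard-clipping integer quantizer (helper names mine): a transient that peaks 2x above expected full scale pins at the 16-bit ceiling, while the same peak recorded about 12 dB lower into 24 bits stays comfortably in range.

```python
def quantize(x, bits):
    """Map a full-scale float (-1.0..1.0) to a signed integer sample,
    hard-clipping anything outside the representable range."""
    full_scale = 2 ** (bits - 1) - 1
    return max(-full_scale - 1, min(full_scale, round(x * full_scale)))


clipped = quantize(2.0, 16)       # pinned at 32767: audible clipping
safe = quantize(2.0 * 0.25, 24)   # recorded ~12 dB down: no clipping
```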
But people saw that studios were using higher bit-depths and sample rates than were being delivered to us as a playback format and basically assumed something was being held back from us.
And someone else came along and saw the potential for selling this as "high resolution" music they could charge more for.