This reminds me of the Monty Python sketch about having an argument; a plain contradiction is not the same thing.
I don't know if you are contradicting me just for the fun of it (or because you need the energy, vampire-like) or just to see how I react.
This is my final word about this.
Your point is ridiculous.
Why record at 24-bit/96 kHz or higher if it sounds the same as 16/44?
A ruse to take up more space on your hard drive?
The question of why sample rates have gone up as high as 192 kHz is an interesting one, because there's surely no technical reason why they ever need to go much past 60 kHz, worst case. That's not just my opinion; it's the opinion of some high-powered ADC designers and experts on human hearing.
For bit depths, there are good reasons for recording and producing at 24 bits, particularly for live events...but properly done 16-bit and 24-bit transfers of a source like the 'Fragile' analog master tape shouldn't sound different. 16 bits gives us 96 dB of 'resolution', which exceeds not only what old analog recordings like this have to offer, but also what even very good home listening environments require. 24 bits gives plenty of 'insurance', and even makes some sense as a delivery format nowadays, when signals are often streamed right into complex DSP chains in receivers, which would convert 16 to 24 anyway. But an *intrinsic* difference, like 'more presence' just by virtue of being 24-bit versus 16? Nah. There's no experimental support for that.
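For anyone curious where the 96 dB figure comes from, it's just the standard quantization arithmetic: each bit of depth buys roughly 6.02 dB of theoretical dynamic range (20 times the log10 of 2). A minimal sketch of that calculation (function name is mine, purely illustrative):

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an N-bit quantizer: 20 * log10(2^N)."""
    return 20 * math.log10(2 ** bits)

# 16 bits comes out to ~96.3 dB; 24 bits to ~144.5 dB.
print(f"16-bit: {dynamic_range_db(16):.1f} dB")
print(f"24-bit: {dynamic_range_db(24):.1f} dB")
```

That ~96 dB is already beyond the signal-to-noise ratio of any 1971 analog master tape, which is the point.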
So am I saying that 16/44 should be 'good enough' if properly done; that the difference you hear in remasters is overwhelmingly due to the remastering EQ/levels; that the 'high-rez' formats in home delivery media have a significant element of hype to them; and that the industry has no moral compunction about hyping 'specs' with little or no practical significance when it suits them for reselling the same product over and over?
Heh.
With this same philosophy, 30 ips sounds the same as 15! (yeah, right)
What 'philosophy' are you imagining that I am promoting? That nothing makes a difference? That's not the case.
Sometimes a process likely makes a real audible difference. Sometimes not. It's not magic...there are well-founded technical reasons why one could expect 15 vs 30 ips to make a real audible difference. That doesn't mean every measured difference makes an audible difference.
I'm a recording engineer (back when it was 24-track 2"), and although I'm not taking a holier-than-thou attitude,
Of course not. You're merely saying that whatever effect you think you hear must be real, and must be happening for the reason you say it is.
I can hear a huge difference when I record at 24 bits or higher than at 16.
There are a couple of good technical reasons to record and produce at 24 bits, and to do digital processing at 24 bits. They don't have to do with 'presence'. They have to do with allowing enough headroom (especially for a live recording), and with dealing with accumulated digital errors during digital editing/mastering/processing, which could become audible if it were done at 16 bits. That's all.
So, sure, if you overloaded your 16-bit format during recording, or did a ton of digital processing on it afterwards, it could sound much worse than the same recording kept at 24 bits. But I have a feeling you aren't talking about that situation. If you record and edit at 24, then properly convert to 16 (with dither), the only difference you should hear would be if you took the very quietest part of the recording and listened to it at a level that would be earsplitting during the loud parts.
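For anyone wondering what 'properly convert to 16 (with dither)' actually amounts to, here's a minimal illustrative sketch (names and details are mine, not from any particular mastering tool): scale the 24-bit values down by 2^8, add triangular-PDF dither of about one 16-bit LSB, then round. The dither decorrelates the quantization error from the signal, so it comes out as benign low-level noise instead of distortion:

```python
import random

def dither_to_16(samples_24):
    """Reduce 24-bit integer samples to 16-bit with TPDF dither (sketch).

    TPDF = triangular probability density, made by summing two
    uniform random values; peak amplitude ~1 LSB at 16-bit scale.
    """
    out = []
    for s in samples_24:
        scaled = s / 256.0                       # 24-bit -> 16-bit scale (divide by 2^8)
        tpdf = random.uniform(-0.5, 0.5) + random.uniform(-0.5, 0.5)
        q = round(scaled + tpdf)                 # quantize to nearest 16-bit step
        out.append(max(-32768, min(32767, q)))   # clamp to the 16-bit integer range
    return out
```

Done this way, the only artifact of the conversion is a noise floor around -96 dBFS, which is exactly the 'quietest part at earsplitting level' scenario above.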
So this 'huge difference' you routinely hear, I gotta wonder, where does it come from? There's no technical reason for it to really exist between properly done 16- and 24-bit releases. It's a prime candidate for an ABX comparison, which would rule out the usual psychological biases (one of which is that 'better' numbers must mean better sound!).
A voice from a Neumann through a Focusrite at 24 bits is no comparison to 16-bit - if you can't face this...go back to cassettes.
You seem fond of the 'excluded middle' argument...if I disagree that 16 vs 24 should make a huge routine difference, then I'd be happy with cassette-quality sound?
It's also interesting that you're comparing live recording to transferring a 1971-era analog tape with an inherent dynamic range of...care to guess?
Again, this thread is about a 1971 analog tape recording, transferred at 24 bits, EQ'd and compressed to a fare-thee-well, and even remixed (for surround) -- and you're claiming you hear the 'added presence' that's specifically due to the extra 8 bits?
'Pros' aren't immune to normal psychological effects. So, take one of your 24 bit recordings (or rip a track from the Fragile DVD-A...it can be done). Dither it down to 16. Run an ABX comparison at normal listening levels and see if the 'presence' goes away. By your logic, it should. Therefore you should pass that ABX with flying colors.
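The ABX protocol itself is nothing exotic; the scoring is just bookkeeping, as in this sketch (the `listener_guess` callback is hypothetical, standing in for the actual blind listening):

```python
import random

def abx_run(trials, listener_guess):
    """Minimal ABX scoring sketch.

    On each trial, X is secretly assigned to 'A' or 'B'; the listener's
    guess (a callback returning 'A' or 'B') is scored against that
    hidden assignment. Returns the number of correct identifications.
    """
    correct = 0
    for _ in range(trials):
        x = random.choice(['A', 'B'])   # hidden assignment for this trial
        if listener_guess() == x:
            correct += 1
    return correct

# A listener who genuinely can't tell A from B hovers around 50% correct
# over many trials; a real audible difference shows up as a score well
# above chance.
score = abx_run(16, lambda: random.choice(['A', 'B']))
```

If the 'presence' survives a run like this at normal listening levels, that's evidence. If it doesn't, it was expectation, not the extra 8 bits.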
And btw, I'm not 'looking for an argument'. I'm looking to counter a persistent and, to my mind, pernicious 'audiophile' mindset that puts maximum faith in an inherently flawed 'method' of comparing. It leads directly to things like 'intelligent pebbles', 'cable lifters', green markers on CDs, and other snake oil that users SWEAR makes a 'huge difference'.