Also, if I remember correctly, a CD can theoretically hold 2-3 GB of raw data, but that's reduced to 700 MB because of the error correction...
Okay, so...we're not talking about regular CDs in this thread, and the study is talking about how confining high-resolution media to the CD standard neutralises the benefits. I don't see how that helps in this instance.

Anyone interested in whether we can perceive anything past CD-standard audio quality should read and comprehend this article. It lays out very well why any 'benefit' 99% of listeners hear is placebo. [This is especially true for any format that claims one CD-quality format is superior to another CD-quality format.]
This topic has been beaten to death on countless music forums and social media. When in doubt, trust in the scientific method/process until proven otherwise.
It is a perfectly legitimate question!

No it's not; it's a perfectly legitimate question.
But you don't seem to want to address it; you seem to just want to talk about DACs. My question, boiled down to its most basic, is whether a standard CD laser is a weak link in the system and whether eliminating it in favour of a lossless digital reproduction might produce a higher-quality output.

It is a perfectly legitimate question!
They "degraded" high resolution material to CD quality and then let listeners find out which is the real high res material and which is the degraded one. And most listeners, including trained audio engineers, did not have a significantly higher hitrate than 50%, which means they do not hear it better than someone who is just guessing.I'm assuming they didn't degrade the sound of the various types of CD they were testing, just the SACD & DVD-A.
when the best supporting argument on both sides is "just trust us"
In that case you would not need to compare, because the data would be bitwise identical. Or maybe you meant that.

One good comparison would be to find one of these SHM-CDs that has the EXACT same master as a normal CD and then compare.
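For what it's worth, that bitwise comparison is easy to automate. A minimal sketch, assuming both rips were saved as WAV; the filenames are hypothetical:

```python
import hashlib
import wave

def audio_hash(path, chunk_frames=65536):
    """SHA-256 of the decoded PCM frames in a WAV file, ignoring header/tags."""
    h = hashlib.sha256()
    with wave.open(path, "rb") as w:
        while True:
            frames = w.readframes(chunk_frames)
            if not frames:
                break
            h.update(frames)
    return h.hexdigest()

# Hypothetical filenames: rips of the same album from an SHM-CD and a regular CD.
if audio_hash("shm_rip.wav") == audio_hash("regular_rip.wav"):
    print("Bitwise identical audio: same master, same data.")
else:
    print("Different audio data: these are different masters.")
```

Hashing only the PCM frames (rather than the whole file) avoids false mismatches from differing headers or tags.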
You can believe whatever you want. But that does not change the science. Best is to find out yourself. We did that, and now we have given you the hints on how to do that yourself. If you prefer to keep believing, that is your choice.

Why should I believe a random person posting anonymously to a forum rather than trusting the manufacturer's claims?
That's a fair conclusion. I was hoping to conjure some extrapolation of the same idea directly towards any vaporous 'audiophile' claim (of which there are too many to list here). Most of the digital technology we use in the audio-verse has been fleshed out by this point, 50 years on; most of the "but what about" questions and issues have been addressed ad nauseam. The root source for these questions inevitably always stems from hardware manufacturers and the various audiophile media companies wanting to sell a product for more money. If there's no audible difference for that 99% of the human race between CD quality and higher resolution, then there's certainly less between CD-quality formats as you presented them in your OP. It's not a giant leap to make. If anything, it seems like an obvious conclusion.

Okay, so...we're not talking about regular CDs in this thread, and the study is talking about how confining high-resolution media to the CD standard neutralises the benefits. I don't see how that helps in this instance.
I'm not looking for validation or a fight; just a reasonable and civil response.

If you're just looking for validation of your conclusions, it doesn't seem like you're going to find it here. Maybe there's a forum with members drinking the right flavor of Kool-Aid out there.
That seems about right:

Also, if I remember correctly, a CD can theoretically hold 2-3 GB of raw data, but that's reduced to 700 MB because of the error correction...
The 588 bits include 192 bits of audio data, 64 bits of error-correction data, and 8 bits of subcode data; the rest is padding and merging bits created by the specific encoding used to write the data to the disc, to counter read errors.

a frame ends up containing 588 bits of "channel data" (which are decoded to only 192 bits of music).
- https://en.wikipedia.org/wiki/Compact_Disc_Digital_Audio
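To sanity-check the "2-3 GB" figure quoted above, here's a minimal back-of-the-envelope sketch using those frame figures (assuming an 80-minute disc; exact capacity varies by disc):

```python
# CD frame arithmetic from the figures above: 192 audio bits become
# 588 channel bits on the disc.
AUDIO_BITS_PER_FRAME = 24 * 8      # 24 bytes of PCM audio per frame = 192 bits
CHANNEL_BITS_PER_FRAME = 588       # what actually gets written to the disc

# 44.1 kHz x 2 channels x 16 bits = 1,411,200 audio bits/s -> 7350 frames/s
FRAMES_PER_SECOND = 44_100 * 2 * 16 // AUDIO_BITS_PER_FRAME
SECONDS = 80 * 60                  # an 80-minute disc

audio_bytes = FRAMES_PER_SECOND * SECONDS * AUDIO_BITS_PER_FRAME // 8
channel_bytes = FRAMES_PER_SECOND * SECONDS * CHANNEL_BITS_PER_FRAME // 8

print(f"Audio payload:   {audio_bytes / 1e9:.2f} GB")    # ~0.85 GB
print(f"Channel data:    {channel_bytes / 1e9:.2f} GB")  # ~2.59 GB
print(f"Overhead factor: {CHANNEL_BITS_PER_FRAME / AUDIO_BITS_PER_FRAME:.2f}x")
```

So the roughly 3x overhead of the channel encoding is exactly where the "2-3 GB raw vs. ~700-800 MB usable" gap comes from.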
As said before, that difference is then in the mastering and could be achieved with a regular CD as well, or it is in your head only. There is no alternative explanation.

I'm pretty sure there is a difference, at least on my system, even if just an extremely marginal one.
Having ripped thousands of CDs and in the process encountering many that either: ...

...essentially, my question is this: do you get an uplift in audio quality by ripping a file from a disc?
And just to be clear: I'm talking exclusively about CDs. Due to the much more robust error correction built into DVDs and Blu-rays, I don't think the same variables apply.

But having said that, in the absence of any serious issues, it seems extremely unlikely that the difference (which almost certainly exists on paper) would be audible.
They "degraded" high resolution material to CD quality and then let listeners find out which is the real high res material and which is the degraded one. And most listeners, including trained audio engineers, did not have a significantly higher hitrate than 50%, which means they do not hear it better than someone who is just guessing.
A NOTE ON HIGH-RESOLUTION RECORDINGS

Though our tests failed to substantiate the claimed advantages of high-resolution encoding for two-channel audio, one trend became obvious very quickly and held up throughout our testing: virtually all of the SACD and DVD-A recordings sounded better than most CDs, sometimes much better. Had we not "degraded" the sound to CD quality and blind-tested for audible differences, we would have been tempted to ascribe this sonic superiority to the recording processes used to make them.

Plausible reasons for the remarkable sound quality of these recordings emerged in discussions with some of the engineers currently working on such projects. This portion of the business is a niche market in which the end users are preselected, both for their aural acuity and for their willingness to buy expensive equipment, set it up correctly, and listen carefully in a low-noise environment.

Partly because these recordings have not captured a large portion of the consumer market for music, engineers and producers are being given the freedom to produce recordings that sound as good as they can make them, without having to compress or equalize the signal to suit lesser systems and casual listening conditions. These recordings seem to have been made with great care and manifest affection, by engineers trying to please themselves and their peers. They sound like it, label after label. High-resolution audio discs do not have the overwhelming majority of the program material crammed into the top 20 (or even 10) dB of the available dynamic range, as so many CDs today do.

Our test results indicate that all of these recordings could be released on conventional CDs with no audible difference. They would not, however, find such a reliable conduit to the homes of those with the systems and listening habits to appreciate them. The secret, for two-channel recordings at least, seems to lie not in the high-bit recording but in the high-bit market.
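As a concrete illustration of the "no better than 50%" point: here is a minimal sketch of how such blind-test results are typically evaluated, using a one-sided binomial test. The trial counts below are made up for illustration; this is not the study's own analysis:

```python
# Is k correct answers out of n blind trials better than coin-flipping?
# Requires SciPy >= 1.7 for scipy.stats.binomtest.
from scipy.stats import binomtest

trials, correct = 100, 56  # hypothetical listener: 56 right out of 100

result = binomtest(correct, trials, p=0.5, alternative="greater")
print(f"p-value: {result.pvalue:.3f}")
# p is roughly 0.14 here: 56/100 is entirely consistent with guessing,
# which is the sense in which the listeners "did not hear a difference".
```

A listener would need around 59 or more correct out of 100 before the result starts looking unlikely under pure guessing at the usual 5% threshold.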
It's not impossible that it's cognitive bias on my part; if so, it's subconscious, as I did confuse the discs and ended up thinking I'd fooled myself, before realising that I'd just got the discs mixed up...

As said before, that difference is then in the mastering and could be achieved with a regular CD as well, or it is in your head only. There is no alternative explanation.
That is your right, but then please don't ask us about lasers hitting surfaces and bits coming out of them, because that has nothing to do with belief, and we cannot argue for or against beliefs.

I don't believe it was.
You are making things up here. There is no difference in bit rate introduced by the laser. The bit rate is determined by the digital data. Whether you read it via laser from optical storage, magnetically from a hard disc, or electrically from solid-state storage makes no (NO!!!) difference. The data is the same. No noise is introduced by reading the data. Noise can be introduced in the analog-to-digital (ADC) or digital-to-analog (DAC) conversion, which happens before writing to or after reading from the data store, respectively.

difference in bit-rate/data-transfer-rate between a file read from a CD via a laser and a file read from a lossless digital source
Not sure what different words to try to use. You either successfully read a digital dataset or you don't; there is no third state. I'm not sure if you are intentionally disagreeing that digital audio storage is a sequence of ones and zeros. Yeah, I thought that was on the face of it... Anyone got a link for the original documents on PCM audio with 16-bit samples at 44.1k samples per second?

But you don't seem to want to address it; you seem to just want to talk about DACs. My question, boiled down to its most basic, is whether a standard CD laser is a weak link in the system and whether eliminating it in favour of a lossless digital reproduction might produce a higher-quality output.
Maybe if we reframe it...how about this: why should I believe a random person posting anonymously to a forum rather than trusting the manufacturer's claims?
You've offered me an argument from authority, which basically boils down to "just trust me". Why should I believe either of you?
We're not listening to the data reading utility. We're listening to the already-read data fed through a DAC. The data transfer part - slow/fast lasers and all - either happens successfully or not. If the system is too slow to keep up with playback, you get dropouts or the sounds of error correction trying to fill dropouts. Because the transfer failed!

https://cdn2.imagearchive.com/quadr...bility-of-a-CD-Standard-v-High-Resolution.pdf
@stoopid
Okay, so...let me know if I've got this right...or not...up to you.
I've been re-reading it, and it seems that, in relation to the central question of this thread, the article is relevant in terms of the bit-rate?
The difference in bit-rate/data-transfer-rate between a file read from a CD via a laser and a file read from a lossless digital source (assuming, for the sake of argument, that there is a difference, albeit a very small one) is negligible in terms of the noise floor of the experiment detailed in this paper...
...which would suggest that even if there is a difference in data-read speed or completeness of signal when using a laser, it's indeterminable in practical terms; or in other words: no one can realistically hear a difference.
That's it, right? Have I got it?
Edit: ...okay...now I want validation. Did I get it right?
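For a rough sense of scale, a minimal sketch of one number behind that "noise floor" talk: the theoretical dynamic range of 16-bit PCM. (The paper measured its own playback chain's noise floor; this is only the idealized CD-standard limit, shown here for illustration.)

```python
import math

# Theoretical dynamic range of n-bit PCM: 20 * log10(2^n).
bits = 16
dynamic_range_db = 20 * math.log10(2 ** bits)
print(f"{bits}-bit dynamic range: {dynamic_range_db:.1f} dB")  # ~96.3 dB
```

Anything that sits below that floor, or below the playback chain's own (usually higher) floor, is by definition not something anyone is hearing.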
Not really...I didn't think this was fun at all, but thanks for the advice re: trying different masters, and good luck to you.

Well, this was fun! You're searching for some magic to find. I do understand! But I'm out. Good luck!
(Still telling you it's differently mastered editions to chase though.)