Technical question re. SHM-CD, Blu-spec CD & Digital Files

I have high confidence that some people can hear things I can’t, and probably never could. My wife has a color sense that drives me nuts, talking about subtleties that I simply cannot perceive.

So is there an audible difference? I doubt it, with high confidence. Every digital storage medium is inherently imperfect. That’s why error correction was developed, and from what I know, it differs from medium to medium. A ripped file stored on a hard drive uses a different error-correction scheme than the same file on a CD, an SSD, or a floppy disk. RAIDs use an error-correction scheme that is specific to that format. And all those error-correction schemes are transparent to us end users, unless they fail.

If someone comes up with a truly better, more robust means of storing data on a pressed CD, all that would mean is that the error correction gets called on less frequently, which would be transparent to us. We could write a program that tells us when the error correction kicks in. I suspect most of us would be amazed at how often it is needed, but I don’t have any data.
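To make the "transparent to us" point concrete, here's a toy sketch in the spirit of the CIRC (Cross-Interleaved Reed-Solomon) coding that CDs actually use. It relies on the third-party reedsolo package rather than anything a real CD player does, so treat it strictly as an illustration of errors being corrected silently unless you go looking for them:

```python
# Toy demonstration of transparent error correction. Uses the
# third-party `reedsolo` package (pip install reedsolo); the real CD
# CIRC layer is more elaborate, so this is only an illustration.
from reedsolo import RSCodec

rsc = RSCodec(10)                 # 10 parity bytes: corrects up to 5 byte errors
payload = b"16-bit PCM audio samples would go here"
encoded = rsc.encode(payload)     # data + Reed-Solomon parity

# Simulate media damage: corrupt three bytes of the stored block.
damaged = bytearray(encoded)
for pos in (0, 7, 20):
    damaged[pos] ^= 0xFF

# decode() returns (decoded message, full codeword, errata positions)
# in reedsolo >= 1.0.
decoded, _, errata = rsc.decode(damaged)
assert decoded == payload         # the user never sees the errors...
print(f"corrected {len(errata)} byte errors silently")  # ...unless we ask
```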

I haven’t done any tests myself with Blu-spec CDs or SHM-CDs. But if they conform to the Red Book standard, I’d be surprised if there were audible differences. I can be proven wrong, of course, but I’m pretty confident that I won’t be.
 
It's been a while since I've ripped a disc, but I seem to remember the software I was using giving me an accuracy rating for the rip...

...not sure if re-ripping it would give a different accuracy rating, but I think it's probably safe to assume that whatever aberrations (so to speak) there might be don't make a discernible difference to the output.

With the three Mountain discs I have (CD, Blu-spec CD and Blu-spec CD2), I don't think I'd be able to tell a difference between them unless I could listen to them back to back. I don't think there's enough of a difference for people to notice unless they're concentrating on trying to identify a very subtle, granular difference in quality. But it wasn't truly a blind comparison, and the conditions weren't controlled: I don't know if, or by how much, my particular setup is influencing the sound I'm getting from the discs. And those three CDs are the only comparison I've made; I couldn't say how things might differ with other albums on those formats.

General consensus on the topic seems to be that Blu-spec CDs and SHM-CDs are not really any better than regular CDs, although I have read that SHM-CDs generally carry more carefully crafted masters. But that's just hearsay; I've no idea if it's true.
 
Every digital storage medium is inherently imperfect. That’s why error correction was developed,
The essence for most digital storage is the "correction" part, because if even one bit flips, your program or data is unusable. This is a little different with audio CDs, which are played back for immediate D/A conversion: if one bit flips there, you will probably not hear it in the audio signal. So some slack can be cut with the error correction on audio CDs.

Still, we know that perfect retrieval of the data from audio CDs is possible, as supported by comparing multiple rips from multiple sources (-> AccuRip). So I think we don't really have to worry about that anymore, and as laid out before: the absence of broken data does not contribute to the overall audio "quality", only to the absence of glitches.
 
With ripped discs (assuming you rip to WAV) you can at least compare file sizes. Back in the DOS days, I recall a file-comparison utility that would examine two files and tell you the differences, although if they were different sizes, that's all it would tell you.
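A few lines of Python do the same job today. A minimal sketch (the file names are placeholders); note that two rips of the same audio can still differ in their WAV headers, so a mismatch here doesn't always mean the samples differ:

```python
# Byte-for-byte comparison of two files, in the spirit of the old DOS
# file-compare utilities. File names are placeholders.
def compare(path_a: str, path_b: str) -> None:
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    if len(a) != len(b):
        print(f"sizes differ: {len(a)} vs {len(b)} bytes")
        return
    for offset, (x, y) in enumerate(zip(a, b)):
        if x != y:
            print(f"first difference at byte offset {offset}")
            return
    print("files are identical")

compare("rip_pressing1.wav", "rip_pressing2.wav")
```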

It makes sense that a producer of “better than CD” quality releases would be more careful with mixing and mastering, but that’s no guarantee that the decisions were in agreement with your or my tastes.
 
If one bit flips there, you will probably not hear that in the audio signal.
Well, that depends on which bit got flipped. If it’s bit 14, 15, or 16, yeah, I doubt we could tell, but if it’s bit 1 or 2, I’m pretty sure (especially if it went from 0 to 1) that it would be audible. DSD is another story, since all that’s actually recorded is the LSB.
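The arithmetic backs that up. In 16-bit PCM a flipped bit changes the sample by 2 to the power of the bit position (numbering from the LSB here, the reverse of the numbering above), so a flip near the MSB produces an error near full scale while an LSB flip sits down at the noise floor. A quick sketch:

```python
# How loud is a single flipped bit in a 16-bit PCM sample?
# Error magnitude is 2**bit (bit 0 = LSB, bit 15 = MSB/sign bit),
# expressed relative to full scale (32768).
import math

for bit in (0, 1, 13, 14, 15):
    error = 2 ** bit
    dbfs = 20 * math.log10(error / 32768)
    print(f"bit {bit:2d} flipped -> error at {dbfs:6.1f} dBFS")

# bit  0 -> -90.3 dBFS (buried under almost any music)
# bit 15 ->   0.0 dBFS (a full-scale click)
```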

But as long as the disc is in decent shape, error correction does its job, and we are oblivious.

I have a few “mix tape” CDs that I burned for the car. They are about ten years old and full of skips and stutters.
 
The failure scenarios: when something goes wrong and data is lost, the built-in error correction does what it can. We can talk about the sound-quality impact of listening to a dropout-riddled file, and about the artifacts caused by the system's attempts to correct for errors.

At that point things are broken! But we can critique how well the correction attempts work. I'm being dismissive of the error scenarios here, but yes, absolutely, this is a crafty thing to fall back on when disaster hits! Being analytical and drawing a line between error-free data and everything else: audio samples are audio samples. The only conversation we can have about the sound of error-free data is really about the DAC being used to listen to it.

We can certainly talk about the failure scenarios and how pleasing the remaining sound is. I don't mean to just dismiss this. It's just not fair to talk about it as the sound of a format. We're trying to talk about how the car handles driving down the road, not how it crashes into a wall! (Today's stupid car analogy attempt.)
 
It makes sense that a producer of “better than CD” quality releases would be more careful with mixing and mastering, but that’s no guarantee that the decisions were in agreement with your or my tastes.
The paper posted above seems to suggest that to be the case (in the note at the end on HD media), but to be honest, with my current setup, across those three discs specifically, the difference is so minimal it's barely discernible; not what you might expect from something that you'd think would be a clear improvement.

The authors of the study linked above seem to think that there is a noticeable difference with DVD-A & SACD, which, after discussion with sound engineers, they attributed to more care being taken with the mastering. They note that it should be possible to get the same type of quality from a CD.

I just received Asterix's 1970 self-titled album on SHM-CD and have two other alternate pressings of that CD on their way to me, so I'm going to see if there's a noticeable difference between those...at the very least I might be able to report that the mastering is better on the SHM-CD, but who knows.

I was thinking of actually trying to contribute to the https://dr.loudness-war.info/ site. A lot of the stuff I'm interested in just isn't on there, so I'm probably going to have to pick up the different pressings for comparison.
 
The paper posted above seems to suggest that
The paper is 17 years old, so I would not derive a rule about non-technical aspects of current audio productions from it...
I was thinking of actually trying to contribute to the https://dr.loudness-war.info/ site.
I do regularly consult that site when looking for older, pre-loudness-war editions. But the data quality there is mixed at best. Also note that good DR values say almost nothing about the actual production quality.
 
Comparing different releases of an album with no other variables is an eye-opening exercise if you haven't ever done this.

Variables range from simply different volume levels (louder always sounds better) to different equipment, different DACs, etc.

Pick a favorite older release, something that has 6 or 7 CD reissues and a 24-bit HD release or two. Rip the files to WAV. Put each version on its own track in your favorite DAW app.
(Audacity is free. Reaper can be demo'd for free.)

You'll already spot the brick-wall-limited and louder files by sight, and you'll hear some things right away... But start with matching levels across all the versions. Turn the louder ones down. Leave the quietest one at 0 dB (unity). If the lows and highs differ so much that you have to pick one or the other to match volume by, go for overall presence.
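If you'd rather compute the trims than set them by ear, here's a minimal sketch using the third-party numpy and soundfile packages (file names are placeholders). It measures each version's RMS level and prints the cut needed to bring it down to the quietest one:

```python
# Measure the RMS level of each rip and compute the cut needed to match
# the quietest version. Uses numpy and soundfile; file names are
# placeholders.
import numpy as np
import soundfile as sf

versions = ["cd_1994.wav", "remaster_2003.wav", "hd_24_96.wav"]

def rms_db(path: str) -> float:
    data, _ = sf.read(path)   # float samples in [-1.0, 1.0]
    return 20 * np.log10(np.sqrt(np.mean(data ** 2)))

levels = {v: rms_db(v) for v in versions}
quietest = min(levels.values())
for v, lvl in levels.items():
    print(f"{v}: RMS {lvl:6.2f} dBFS, trim by {quietest - lvl:+.2f} dB")
```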

Now start A/B comparing the different versions!
This is all being listened to through the same DAC. You've normalized the overall volume level to avoid picking the louder as better. You're hearing the mastering decisions and the work that was done.

Technical:
If you have different sample rates across versions, you can upsample them all to 96k with SoX or r8brain first.
(Don't trust upsampling? Run a test of 100 iterations of back and forth and see for yourself!)
If you use Reaper, you can have it resample on the fly when you mix different sample rate audio in a project. Reaper can use r8brain for this and it's fully transparent.
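If you go the SoX route, scripting it is a one-liner per file. A minimal sketch, assuming the sox binary is installed and on your PATH (file names are placeholders):

```python
# Upsample a set of rips to 24-bit/96k with SoX's very-high-quality
# rate effect. Assumes the sox binary is on the PATH; file names are
# placeholders.
import subprocess

for name in ["cd_1994.wav", "remaster_2003.wav"]:
    out = name.replace(".wav", "_96k.wav")
    subprocess.run(["sox", name, "-b", "24", out, "rate", "-v", "96000"],
                   check=True)
```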

'Exclusive solo' is usually a good way to A/B. Option-command-click a solo button in Reaper and it un-solos previously solo'd tracks. This lets you instantly switch between versions with no distracting click or gap in the audio.

You'll usually find a few dueling CD versions that are identical. A null test between them will verify that quickly.

Kind of an eye-opening experiment if you haven't done this or don't realize how much the mastering varies.

Here's another homework assignment:
Convert a 24-bit 96k file of a song to 44.1k, and then from 24-bit to 16-bit. Now convert it back to 24-bit and 96k (so you can compare it against the original). Use r8brain or SoX. (The free XLD includes SoX.)
Hear anything? It's not suddenly volume-war loud or shrill sounding, is it!
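That round trip is easy to script as well. A minimal sketch with the SoX CLI (same assumption that sox is on your PATH; the input file name is a placeholder), adding dither on the way down to 16-bit:

```python
# Homework: 24/96 -> 16/44.1 (dithered) -> back to 24/96, so the round
# trip can be A/B'd against the original. Assumes sox is on the PATH;
# the input file name is a placeholder.
import subprocess

def sox(args):
    subprocess.run(["sox"] + args, check=True)

src = "song_24_96.wav"
sox([src, "-b", "16", "down_16_441.wav", "rate", "-v", "44100", "dither"])
sox(["down_16_441.wav", "-b", "24", "roundtrip_24_96.wav", "rate", "-v", "96000"])
```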
 
It's easy enough to amass a collection of a few hundred CDs where every last one of them is volume-war mastered, and to conclude from that that the CD format has that fidelity limitation. But that's the mastering, not the format.

It takes a little effort to work through some of this (especially while certain industry players are gaslighting you along the way).

I just tried to listen to the new Deep Purple album. I grabbed the 24-bit 48k master edition. The high-falutin' copy, right? -5.4 LUFS! Not a typo. I had to pull up a meter just to see. -5.4 LUFS! Fully unlistenable. They could have released this in 8-bit 32k. But they put this novelty destruction to 24-bit for some reason. This volume-war stuff is still going on. The CD format might be the calling card for volume-war mastering, but 24-bit downloads and Blu-rays fall victim often enough too.
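If you want to check a file's loudness yourself, the measurement takes a few lines. A minimal sketch using the third-party pyloudnorm and soundfile packages (the file name is a placeholder):

```python
# Measure integrated loudness (LUFS) of a file with an ITU-R BS.1770
# meter. Uses pyloudnorm and soundfile; the file name is a placeholder.
import soundfile as sf
import pyloudnorm as pyln

data, rate = sf.read("album_track.wav")
meter = pyln.Meter(rate)                    # BS.1770 loudness meter
print(f"integrated loudness: {meter.integrated_loudness(data):.1f} LUFS")
```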

Did someone completely inept just get hired at a major studio? And there was no quality control? (So the QC people too?) Or was this an intentional novelty release with intentionally lo-fi sound? Either answer sounds insane to suggest... but it's one of those two things.
 
I appreciate the advice, but my setup isn't nearly as technical, and my PC isn't hooked up to my receiver.

For me it comes down to how soft/clear the treble is, if that makes sense?

With the speakers I have, I'm limited to how high I can turn up the volume before the treble becomes too harsh and distorted, so really, for me, that's the limiting factor.

I have found, however, that the higher quality the release, the less the treble is a problem: most of my DVD-A and Blu-ray Audio discs I can turn up as high as I need with no problem, but not the loudness-war CDs.

With the really tinny, shrill releases, I've found that switching to DTS Neo:6 injects a lot more bass into the sound and seems to give it all a lot more body, if you know what I mean? It thickens out the sound and seems to open up the mix... so that's what I'm using almost exclusively now for CDs.

I'm not sure how much my speakers affect the sound; as I say, they don't have a problem with the high-res audio treble, they handle it really nicely, but I'm left wondering whether more expensive speakers might help with the loudness-war problem.
 
Speakers affect the sound more than any other component in a sound system; they have the biggest potential for variance. But literally what is recorded into the file is the absolute biggest variable: different masterings, that is. Sometimes they're as striking as different music. Grab an EQ knob and make a move: you hear that on everything! Phone speaker. 2" TV speaker. Volume-war and treble-hyped mastering (those usually go together) alters the sound very aggressively and eclipses any format limitation and most hardware limitations. i.e., it sounds bad on everything from Amazon-cheap to audiophile boutique.

Put your digital media through an audio interface or AVR with a computer and you can investigate any of this, as well as have access to all the formats in their fullest quality. An HTPC is really the way to go. You'll discover that all these digital formats offer full audio quality when used properly, and that all of them can just as easily have garbage put on them. Get "better seats" for your favorite music!

Just a volume difference of 0.5 dB makes us choose the louder of two copies of the very same file as sounding better. Trying to A/B with different hardware media players is nearly impossible. If the differences in the audio content are subtle, and possibly within perception bias to begin with, then A/B'ing is fully impossible unless you are very accurately level-matched and there are no clicks/chirps/gaps in the sound when you press the button to switch.
 
At that point things are broken! But we can critique how well the correction attempts work.
If the error-recovery system is inadequate for perfect recovery, then, yes, something is broken. But just as analog never reproduces perfectly (even if pleasantly), the designers of digital devices, particularly storage devices, understand that among the trillions of bits stored there will be at least a few mistakes, and that is why there is error recovery. My earliest experience with that was the parity bit in RS-232 communications, which would make the total number of 1s in a data word either odd or even, depending on the setting. If the word had an odd number of 1s and the parity was set for even, then the parity bit would be a 1, making the total number of 1s in the word even. It wasn't so much correction as detection, but at least you would know you had bad data. I once designed a small data-capture device that would grab a four-digit BCD number from an ATM and check that the 16 bits held four valid BCD digits; if so, it would trigger a response from the device. And TCP/IP can trigger retransmission of packets that fail the checksum test that every packet on the Internet carries.
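That parity scheme is small enough to show in full. A minimal sketch of even parity over a 7-bit word, as in classic RS-232 framing (detection only: one flipped bit is caught, two flips slip through):

```python
# Even parity over a 7-bit word, as in classic RS-232 framing. The
# parity bit makes the total count of 1s even; a receiver that counts
# an odd number of 1s knows the word was corrupted (detection only).

def with_even_parity(word7: int) -> int:
    ones = bin(word7 & 0x7F).count("1")
    return ((ones & 1) << 7) | (word7 & 0x7F)   # parity in the top bit

def parity_ok(word8: int) -> bool:
    return bin(word8 & 0xFF).count("1") % 2 == 0

sent = with_even_parity(0b1010011)   # four 1s -> parity bit stays 0
assert parity_ok(sent)
corrupted = sent ^ 0b0000100         # one bit flipped in transit
assert not parity_ok(corrupted)      # detected, but not correctable
```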

But, as we noted earlier, error-correction schemes are intended to be transparent to the user. That may or may not be a good thing: if the errors were visible, a deteriorating medium could be replaced before total failure, instead of failing without warning once the errors become too great for the correction scheme.
 
I guess what I'm saying is that it's worth jumping down a layer with this stuff. We can say "these two devices sound different," and that's not incorrect! But if we jump down a layer... oh, I've got two systems here: data delivery and the DAC. The second one sure isn't going to work if the first one fails. Oh shoot, we were trying to critique the DAC's performance, but the problem is that it isn't getting the data!

Detection alone is still welcome! One thing digital audio made possible is the ability to literally subtract one file's dataset from another. And in fact, if all the ones and zeros are precisely identical (and only if they are all identical), the difference is strictly zeros: a perfect null. You can only do this with a digital recording. Analog has an ever-changing noise floor; the digitized file is a snapshot, original noise floor and all.

Partial null results can still be useful and interpreted. Two audio files that sound alike at a glance (a quick A/B by ear) but null down to only, say, -90 or -100 dB can be assumed to come from the same source, with one having an altered noise floor versus the other: either an extra analog generation or a digital sample-rate conversion. But I'm getting ahead of things and back into the subjective.
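The null test itself takes only a few lines. A minimal sketch with the third-party numpy and soundfile packages (file names are placeholders, and it assumes the two files are already sample-aligned):

```python
# Null test: subtract one decoded version from the other and report the
# peak residual. Bit-identical data nulls perfectly; a residual around
# -90 dBFS or lower suggests the same source with a differing noise
# floor. Assumes the files are sample-aligned; names are placeholders.
import numpy as np
import soundfile as sf

a, _ = sf.read("version_a.wav")
b, _ = sf.read("version_b.wav")
n = min(len(a), len(b))              # tolerate a small length mismatch
residual = a[:n] - b[:n]
peak = np.max(np.abs(residual))
if peak == 0:
    print("perfect null: bit-identical")
else:
    print(f"peak residual: {20 * np.log10(peak):.1f} dBFS")
```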

Deteriorating media in the digital realm is indeed insidious!
User experience:
Perfect read.
Perfect read.
Perfect read.
Perfect read.
Perfect read.
Perfect read.
Error! Can't read.
The end.
 
I'm left wondering whether more expensive speakers might help with the loudness war problem.
About as well as a faster car would help with a speed limit sign problem! :D

Compression distortion like that is the least undo-able thing in audio, FYI. These are heavily distorted novelty releases, and they can't be cleaned up, not with the most impressive audio tools or the most powerful computers.
 
I buy a lot of Japanese UHQCDs. Almost without exception they sound superior to the previously released "regular" Red Book CDs. Comparing the UHQCD version of Yessongs with any previous version as played on my Oppo and/or turntable finds it sonically superior, with no remastering claimed. Contrary to a claim above, they are not louder than their predecessors in my experience, and not quieter either, just clearer.

The manufacturer claims the material used allows superior resolution of the data vs. "regular" Red Book discs. For whatever reason, I find their results closer to SACD, not only in terms of clarity but often in soundstage presentation as well.

And before anyone tries to claim placebo effect: these are primarily releases I had on vinyl years ago and on CDs as they have been released, remastered, etc. over the years, so I am quite familiar with what they sound like.
 
Comparing the UHQCD version of Yessongs with any previous version as played on my Oppo and/or turntable finds it sonically superior, with no remastering claimed.
Yessongs on UHQCD/MQA-CD is in fact a remaster: the same 2013 remaster done for the SHM-CD and SACD by Isao Kikuchi, according to Discogs. According to the Dynamic Range Database, that version is significantly more compressed than previous versions. EDIT: It appears the DR reading on the Dynamic Range Database is broken/bad. If someone with the SACD or CD could provide a proper reading, that would be very cool.

Aside from the fact that the remaster was likely done from a better-quality tape transfer, there's the additional distortion introduced by the MQA encoding on the UHQCD compared to the respective SACD and SHM-CD with the same mastering.
 