That was when digital signals usually went straight to a DAC, so yes, timing variations due to jitter would affect the DAC's output. But that is not what happens with HDMI, especially for the audio. The HDMI input almost always goes into RAM these days, so that video processing can be applied and the audio extracted and sent on somewhere. Provided the HDMI jitter is low enough that the RAM copy at the receiving end exactly matches what the sender had, it is a 100% accurate copy. If the data is clocked out again, e.g. to a DAC or over another HDMI link, it is clocked by a new local clock that has nothing to do with the incoming HDMI clock. These systems are all basically computers inside now, and everything is buffered in RAM, which is why we have lip-sync issues to deal with. And if the HDMI jitter is high enough that the RAM copy is not 100% correct, things will go rapidly downhill with that much jitter.
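To make that concrete, here is a minimal C sketch of the buffering idea (my own illustration, not any vendor's actual firmware): the write side runs at whatever rate the HDMI link delivers data, the read side runs on an independent local clock, and as long as the bits arrive intact the output timing owes nothing to the link.

```c
/* Minimal sketch of why a RAM buffer decouples output timing from
 * incoming HDMI jitter: samples are written at whatever rate the link
 * delivers them, but read out on an independent local clock. If the
 * bits land intact, the read side never sees the link's timing. */
#include <stdint.h>
#include <stdio.h>

#define BUF_SIZE 4096                /* power of two for cheap wrap  */

static uint32_t buf[BUF_SIZE];
static volatile uint32_t wr_idx = 0; /* advanced by HDMI receive side */
static volatile uint32_t rd_idx = 0; /* advanced by local-clock side  */

/* Called whenever the HDMI receiver delivers a sample -- the *timing*
 * of these calls jitters with the link, but only the data is stored. */
void hdmi_rx_push(uint32_t sample)
{
    buf[wr_idx & (BUF_SIZE - 1)] = sample;
    wr_idx++;
}

/* Called by the local sample-clock interrupt (e.g. a clean 48 kHz
 * timer feeding a DAC). Its cadence comes from the local oscillator,
 * not from the HDMI clock. */
int dac_clock_pop(uint32_t *sample)
{
    if (rd_idx == wr_idx)
        return 0;                    /* underrun: buffer ran dry */
    *sample = buf[rd_idx & (BUF_SIZE - 1)];
    rd_idx++;
    return 1;
}

int main(void)
{
    /* Simulate: writes arrive with jittery spacing, reads are steady. */
    for (uint32_t i = 0; i < 16; i++)
        hdmi_rx_push(i * 100);
    uint32_t s;
    while (dac_clock_pop(&s))
        printf("%u\n", (unsigned)s);  /* same bits, new clock */
    return 0;
}
```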
And by the way, "SQ" is the name of one of the 1970s legacy analogue quadraphonic matrix systems that encodes 4 channels in a stereo signal. Using the term SQ on here to mean Sound Quality will confuse people, because this is a quadraphonic site.
Generally I agree with you, and you will see me agree and discuss this multiple times in a lot of forums besides this one. The problem is that I have seen a difference in picture and sound where I induced higher levels of jitter on an HDMI link. BUT that's just me and it's subjective. However, I firmly believe SOMETHING is occurring. You're thinking in digital terms, as I did at first. Then I jumped in with test equipment and really LOOKED at this.
The REALLY freaky thing I have been involved in blind A/B tests with is digital audio via Ethernet and USB. Ethernet is the most mind-bending technically. Digital audio delivered via Ethernet is done asynchronously. The music bits are modulated into a very complex, very analog system that uses five voltage levels on the wire and encodes two bits per clock cycle using four of those levels on each pair; the fifth level is used for error correction. That is just the very surface of this modulation, but it's messy for sure. BUT on the receive end it demodulates all that, does error correction and produces the audio bits. This is messy but has buffers and lots of corrections. After that chip the data is dumped into a large buffer. A separate process then reads out the buffer with its own super-clean clock, i.e. reclocked. Then played. BUT... something REALLY freaky is at work here that I have NO CLUE how. INSANELY, Ethernet cables matter. WTF. Ethernet switches matter. WTF. TONS of discussion on this and lots of blind A/B tests. I have done that A/B a lot on really high-end systems. It's real!?! WTF... I can easily do a blind A/B and with 100% accuracy pick out two Ethernet cables.
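For anyone who hasn't looked at gigabit Ethernet signaling: the five-level scheme described above is 1000BASE-T's PAM-5. Here's a toy C sketch of just the 2-bits-to-4-levels idea (the real standard layers 4D trellis coding and scrambling on top, which I'm completely ignoring here):

```c
/* Toy illustration of the PAM-5 idea behind 1000BASE-T: each wire pair
 * carries one of five voltage levels per symbol, four of which encode
 * two data bits; the fifth level exists for the forward-error-
 * correction (trellis) coding. It only shows why "digital" Ethernet
 * is a multi-level analog signal on the wire. */
#include <stdio.h>

/* 2 data bits -> one of 4 PAM-5 levels; level 0 is reserved for FEC. */
static int pam5_encode(unsigned two_bits)
{
    static const int level[4] = { -2, -1, +1, +2 };
    return level[two_bits & 0x3];
}

int main(void)
{
    unsigned char byte = 0xB4;       /* arbitrary example data byte */
    /* One byte = 8 bits = 4 symbols on a single pair (in this toy). */
    for (int i = 6; i >= 0; i -= 2) {
        unsigned two_bits = (byte >> i) & 0x3;
        printf("bits %u%u -> level %+d\n",
               (two_bits >> 1) & 1, two_bits & 1, pam5_encode(two_bits));
    }
    return 0;
}
```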
WTF... That makes NO SENSE... The music packets are read into a huge buffer in a serious DAC with impressive reclocking...
So what the hell is going on?
I jumped into this with $400,000 in rented test equipment. I found Ethernet jitter and all manner of noise. I found RF that was leaking into the receiving gear. I then jumped inside the gear. I looked CLOSELY at the data lines feeding the DAC. I found tiny changes in jitter coming into the DAC chip!?! WTF... Stepping back in the circuit, it was REALLY interesting. The Ethernet receiver chip spat out data that was a bit jittery. The CPU is of course being interrupted to handle the incoming data from the Ethernet stream, and this jittery incoming data was causing some CPU jitter, because the handling was, and had to be, interrupt-driven. The same CPU also clocked the data out of the buffer and fed it to the DAC chip. So while the overall clock was high accuracy and low phase noise, the bytes going to the DAC had jitter, because the CPU jittered, and that was induced by the Ethernet jitter...
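Here's a toy simulation of that coupling path (my own made-up numbers, not the firmware of the unit I measured): one CPU services Ethernet receive interrupts that arrive with link-dependent timing, and the same CPU clocks samples out to the DAC, so ISR collisions displace the DAC ticks.

```c
/* Toy model of the coupling path: one CPU both services Ethernet RX
 * interrupts and clocks buffered samples out to the DAC. When an RX
 * interrupt lands just before a DAC tick, the tick slips by part of
 * the service time -- so link jitter leaks into DAC data-line timing
 * even though the audio clock itself is clean. Numbers are invented. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define TICKS       100000
#define DAC_PERIOD  20833.0   /* ns, ~48 kHz sample clock            */
#define IRQ_SERVICE   800.0   /* ns spent inside the Ethernet ISR    */
#define IRQ_PROB       0.05   /* chance an RX IRQ collides with tick */

int main(void)
{
    double sum = 0.0, sum_sq = 0.0;
    srand(1);
    for (int n = 0; n < TICKS; n++) {
        double slip = 0.0;
        /* Ethernet RX interrupt occasionally preempts the DAC tick. */
        if ((double)rand() / RAND_MAX < IRQ_PROB)
            slip = IRQ_SERVICE * ((double)rand() / RAND_MAX);
        sum += slip;
        sum_sq += slip * slip;
    }
    double mean = sum / TICKS;
    double var  = sum_sq / TICKS - mean * mean;
    printf("mean tick displacement: %.1f ns\n", mean);
    printf("added rms jitter:       %.1f ns (on a %.0f ns period)\n",
           var > 0 ? sqrt(var) : 0.0, DAC_PERIOD);
    return 0;
}
```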
Hmmmmmmm..... OK... That makes sense... BUT does this effect cause the sound difference? I induced a LOT of jitter on the Ethernet signal; YES, it was audible, measurable and an easy blind A/B. BUT... subjectively, it sounded different. It was NOT the whole effect. I think the spectrum of the jitter also matters, and I did not have an easy way to induce different spectra of jitter.
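The spectrum point is standard sampling theory, so here is a quick numeric check (textbook math, not measurement data from my setup): sinusoidal sampling jitter of amplitude A at frequency fj on a tone at f puts sidebands at f ± fj, each about pi*f*A relative to the carrier, so the same RMS jitter lands in very different places audibly depending on where fj sits.

```c
/* Numeric check of the jitter-spectrum effect: sample a unit tone at
 * f with sinusoidally modulated sample times (amplitude A, rate fj)
 * and compare the error RMS against the small-jitter prediction
 * pi*f*A (the two sidebands at f +/- fj each have amplitude pi*f*A). */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

int main(void)
{
    const double fs = 48000.0;   /* sample rate, Hz                 */
    const double f  = 10000.0;   /* tone frequency, Hz              */
    const double fj = 100.0;     /* jitter frequency, Hz            */
    const double A  = 1e-9;      /* jitter amplitude, 1 ns          */
    const int    N  = 48000;     /* one second of samples           */

    double err_sq = 0.0;
    for (int n = 0; n < N; n++) {
        double t  = n / fs;
        double tj = t + A * sin(2 * M_PI * fj * t);  /* jittered time */
        double e  = sin(2 * M_PI * f * tj) - sin(2 * M_PI * f * t);
        err_sq += e * e;
    }
    printf("measured error RMS: %.3e\n", sqrt(err_sq / N));
    printf("predicted pi*f*A:   %.3e\n", M_PI * f * A);
    return 0;
}
```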
There were also effects on the power supplies. If you look at the supplies for the Ethernet chip, you see noise. This can be the same supply, poorly bypassed, that feeds other chips / subsystems. So I could see noise from the Ethernet decoding on the power rail, and this *might* get into other chips on the same rail, as this stuff is always poorly filtered.
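A back-of-envelope sketch of the shared-rail problem, with illustrative component values I'm assuming rather than ones pulled off any specific board:

```c
/* Why a lightly-bypassed shared rail lets Ethernet PHY switching
 * noise reach other chips: a first-order series-R + bypass-C filter
 * only attenuates by 1/sqrt(1 + (2*pi*f*R*C)^2). Values illustrative. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

static double atten_db(double f, double r, double c)
{
    double w = 2.0 * M_PI * f * r * c;
    return -10.0 * log10(1.0 + w * w);   /* first-order low-pass */
}

int main(void)
{
    const double f_sym = 125e6;  /* 1000BASE-T symbol rate, Hz       */
    /* "Poorly bypassed": plane resistance plus one small cap.       */
    printf("0.01 ohm + 10 nF: %.2f dB at 125 MHz\n",
           atten_db(f_sym, 0.01, 10e-9));
    /* A deliberate ferrite (~10 ohm at RF) + 1 uF does far better.  */
    printf("10 ohm   + 1 uF:  %.2f dB at 125 MHz\n",
           atten_db(f_sym, 10.0, 1e-6));
    return 0;
}
```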
Moving to HDMI... Lots of the same thing.
HDMI has a LOT of error correction. Like A LOT. Bit by bit, as it demodulates, that causes very slight variations in decoding times; an error takes a tiny amount longer. So the hyper-complex HDMI modulation has a LOT of jitter, since it is also splitting and then recombining 3 or 4 lanes of modulation. Demodulating all this and dropping it into a buffer again ends up with some amount of overall control jitter, which you can see by looking at the display driver chips on a Sony OLED. It turns out different HDMI cables cause effects all the way to the display drivers. Exactly how that occurs I don't fully know. But the HDMI data coming in really drives the clocking of the whole system. I would suspect a surround decoder has this same issue as well; I have not measured a Datasat RS20i for this effect.
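Here's a toy model of that decode-time mechanism as I understand it (my simplified picture, not actual HDMI silicon): each symbol normally decodes in a fixed time, but a symbol that needs correcting costs a little extra, so the intervals at which decoded data lands in the buffer jitter with the link's raw error rate, i.e. with the cable.

```c
/* Toy model: per-symbol decode time is fixed, except that a symbol
 * needing error correction costs extra. The rms variation of the
 * delivery intervals into the buffer then tracks the raw error rate
 * of the link. Costs and rates are invented for illustration. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define SYMBOLS   1000000
#define T_DECODE  1.000     /* arbitrary units per clean symbol      */
#define T_CORRECT 0.300     /* extra cost when correction kicks in   */

static double interval_rms_jitter(double error_rate)
{
    double sum = 0.0, sum_sq = 0.0;
    for (long n = 0; n < SYMBOLS; n++) {
        double dt = T_DECODE;
        if ((double)rand() / RAND_MAX < error_rate)
            dt += T_CORRECT;                 /* correction stall      */
        sum += dt;
        sum_sq += dt * dt;
    }
    double mean = sum / SYMBOLS;
    double var  = sum_sq / SYMBOLS - mean * mean;
    return var > 0 ? sqrt(var) : 0.0;
}

int main(void)
{
    srand(1);
    /* A better cable = lower raw error rate = steadier delivery.    */
    printf("error rate 1e-6: rms interval jitter %.5f units\n",
           interval_rms_jitter(1e-6));
    printf("error rate 1e-3: rms interval jitter %.5f units\n",
           interval_rms_jitter(1e-3));
    return 0;
}
```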
SO... While YES, the bytes are dropped into a buffer and read out, the clocking of all that ends up being affected by the incoming signal in indirect ways.
OF COURSE... this is all my own research, so you can take it with a grain of salt if you like.
All this research I did over 6 months gave me good insight into my own products and what to do to make things better. Soooo, I am not out to change the world. I got what I needed. I have no need to "prove" this to the world at large with some kind of study; that would be a waste of time for me, I know what's up already.