You may be right; I'd need to revisit all the figuring. I have a pet peeve about anything expressed in 'bits per second' when the file and bandwidth numbers are this (relatively) large; I just find bytes easier to get my mind around. There's some argument that 'bits' make more sense when you're talking about bandwidth rather than file sizes, but I can't remember how it goes, and I suspect it's rooted in reasons from the past that may not make sense anymore.
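For what it's worth, here's the divide-by-8 arithmetic I always end up doing in my head, as a little Python sketch. The 768 kbps figure and the decimal k/M prefixes are just my assumptions for illustration:

```python
# Quick illustration of the bit/byte conversion (8 bits per byte;
# assuming decimal k/M prefixes, which is how streaming bitrates
# are usually quoted).
bitrate_kbps = 768                            # e.g. a 768 kbps stream
bytes_per_second = bitrate_kbps * 1000 / 8    # = 96,000 B/s = 96 kB/s
megabytes_per_minute = bytes_per_second * 60 / 1_000_000
print(f"{bitrate_kbps} kbps ~= {bytes_per_second / 1000:.0f} kB/s "
      f"~= {megabytes_per_minute:.1f} MB per minute")
# -> 768 kbps ~= 96 kB/s ~= 5.8 MB per minute
```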
FWIW: @AYanguas corrected @dabl's bit/Byte slip just a couple of posts later in the original thread.
I want to say that someone--@bracelis, maybe (who hasn't logged in for a year), or @pat bateman?--once posted a very thorough and cogent explanation of the bandwidth issue and its potential implications for fidelity and resolution. My takeaway from it--possibly wrong--was that the streaming services are indeed scrimping on bandwidth when it comes to Atmos, presumably on a theory that's analogous to what we know about mp3.

That is: much of the time, for many kinds of music, 768kbps is actually plenty of bandwidth, even for 7.1.4 channels. It's relatively rare that the density of musical information in any given channel (or combination of channels) will require more than a total of 768kbps to transmit the signal with the requisite clarity, dynamics, and frequency response. But when it does require more than 768k, the channels carrying the more complex parts of the signal get compressed more aggressively, and that's when you lose fidelity.

I reckon Apple and Tidal and Amazon figure, cynically, that since most people listening to Atmos on their services are using headphones or soundbars or Echo speakers, it doesn't matter. But I still find it hard to believe that scrimping in that way really amounts to significant savings on their end. Still hoping that record companies have been delivering their files in high-res all along and that Apple will start streaming them that way with a new OS.
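To put that in rough numbers, here's a back-of-the-envelope sketch. The assumptions are mine: that the streamed Atmos is a 768 kbps stream, and that you can naively divide it across a 7.1.4 channel count, which isn't how the codec actually works (JOC carries a core mix plus object/spatial metadata), so take it as illustration only:

```python
# Rough back-of-the-envelope for the 768 kbps figure, naively treating
# 7.1.4 as 12 discrete channels just to get a feel for the budget.
total_kbps = 768
channels = 7 + 1 + 4                       # naive 7.1.4 channel count = 12
per_channel_kbps = total_kbps / channels   # ~64 kbps per "channel"

# Compare with uncompressed PCM at 48 kHz / 24-bit for a single channel:
pcm_per_channel_kbps = 48_000 * 24 / 1000  # = 1152 kbps per channel

print(f"~{per_channel_kbps:.0f} kbps per channel vs "
      f"{pcm_per_channel_kbps:.0f} kbps for uncompressed PCM")
# -> ~64 kbps per channel vs 1152 kbps for uncompressed PCM
```

On that naive split, the codec is working with a small fraction of what uncompressed audio would take per channel, which is why the mp3-style "it's fine until the music gets dense" argument seems plausible to me.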