TIDAL uses a 768 kb/s bit rate for its DD+ with Atmos streams, so there is no room for improvement there while still using DD+. I assume this is the maximum average bit rate the Dolby codec allows.
I do not know what Apple Music uses.
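Just to put that 768 kb/s figure in perspective, here is a minimal back-of-the-envelope sketch in Python. The DD+ rate is the one quoted above; the ~5 Mb/s TrueHD average is only an assumption I picked for illustration, not a figure from any service or disc spec.

```python
# Rough data-per-album estimate at different Atmos bit rates.
# 768 kb/s is the DD+ rate quoted above; the ~5 Mb/s TrueHD
# average is an assumption used purely for illustration.

def megabytes_per_minute(kbps: float) -> float:
    """Convert a bit rate in kb/s to megabytes of data per minute."""
    return kbps * 1000 * 60 / 8 / 1_000_000

for label, kbps in [("DD+ JOC stream (768 kb/s)", 768),
                    ("TrueHD Atmos, assumed ~5 Mb/s average", 5000)]:
    per_min = megabytes_per_minute(kbps)
    print(f"{label}: {per_min:.1f} MB/min, "
          f"about {per_min * 45:.0f} MB for a 45-minute album")
```

That works out to roughly 5.8 MB per minute for the DD+ stream versus several times that for a lossless track, which is why the lossless option looks heavy for streaming but trivial for a local Blu-ray rip.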
There are audible quality differences between lossy (streaming DD+ JOC) and lossless (Dolby TrueHD with Atmos on Blu-ray).
The differences are clearly noticed by many of us who have compared an Atmos streaming album with the corresponding Blu-ray edition.
I wonder how much of this audible difference is due to the codec format (lossy vs lossless), and how much could be due to a different mastering process intended to make the hi-res edition sound clearly better.
With respect to the use of DD+ in streaming services, I have read somewhere that the choice comes from a technical constraint: a higher bit rate could not be delivered reliably over the bandwidth available to the average Internet user.
I do not quite understand this because:
- The bit rate of Dolby TrueHD with Atmos is variable: MediaInfo on some of my own Blu-rays reports 640/760 kb/s with maximum peaks of about 7 or 8 Mb/s. By comparison, a Netflix user needs up to 20 Mb/s to watch UHD content at 4K (a rough numeric comparison is sketched after this list).
- If a user's Internet link drops below that bandwidth, the video streaming device normally buffers or switches to a lower resolution (HD or less) to cope with the reduced bit rate. We notice a temporary degradation of the video quality that recovers automatically later; I see this quite often with Amazon Prime Video.
- BUT what about music streaming with Dolby TrueHD Atmos? Possibly adaptive decoders that lower the resolution/sound quality on the fly simply do not exist for it.
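To make the bandwidth argument concrete, here is a minimal sketch using only the figures quoted in the list above (768 kb/s for DD+, an ~8 Mb/s TrueHD peak, ~20 Mb/s for Netflix 4K); the 25 Mb/s home connection is a hypothetical value chosen just to show the headroom each stream would have.

```python
# Which Atmos delivery formats fit within a given connection speed?
# Stream figures are the ones quoted in the list above; the 25 Mb/s
# home connection is an assumption chosen purely for illustration.

CONNECTION_MBPS = 25.0  # hypothetical average home downlink

streams = {
    "DD+ JOC (768 kb/s)": 0.768,
    "TrueHD Atmos peak (~8 Mb/s)": 8.0,
    "Netflix 4K video (~20 Mb/s)": 20.0,
}

for name, mbps in streams.items():
    fits = mbps <= CONNECTION_MBPS
    print(f"{name}: needs {mbps:>5.1f} Mb/s -> "
          f"{'fits within' if fits else 'exceeds'} a {CONNECTION_MBPS:.0f} Mb/s link "
          f"({CONNECTION_MBPS / mbps:.1f}x headroom)")
```

On these assumed numbers, even the TrueHD peak would leave a fair margin on an average connection, which is what makes me doubt that raw bandwidth is the whole explanation, unless the real problem is the lack of an adaptive fallback when the link momentarily drops.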
I wonder whether this is really a technical constraint today, or just another case of market segmentation to sell the hi-res, better-quality Blu-ray separately from the “normal quality” stream.