Personalized Spatial Audio in iOS 16 (adding HRTF to AirPods?)

I believe this is what replaced the mono-to-stereo, stereo-to-Atmos demo that used Marvin Gaye's "What's Going On"; I remember listening to it before. For fun I tried it with the new personalized feature. I always found Zane Lowe's voice intro weirdly echo-y pre-"personalization," and now it sounds like he's coming from inside my head, which is neat, and the "turn your head and it's like I'm next to you" bit works a little better.

As with my earlier experiences posted in this thread, the song feels much "closer" to me in space. The backing vocals are practically hovering above my shoulders. Other than that, it's the same experience as listening to the Atmos mix of "good 4 u" by Olivia Rodrigo with the head tracking, but Zane Lowe talks about the head tracking before and during the song.
 
Can anybody give a firsthand report about this?
I have AirPods Pro 1 and iOS 16, and the spatialization is good. I would say they tend to trick your brain by telling it too quickly where the sound comes from; we tend to use our eyes to locate sound.

I have argued that the Apple Renderer is superior to the Atmos Renderer for binaural because Apple has dynamic head tracking, which Atmos does not yet seem to have.

Also, the custom HRTF in iOS 16 was a big improvement for me, but they still have some work to do.
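For context on what "dynamic head tracking" means at the API level: Apple does expose the AirPods motion stream to developers through CMHeadphoneMotionManager in CoreMotion (iOS 14 and later). The sketch below only reads the head orientation; it is not Apple's renderer, and everything outside the CoreMotion calls (the class name, the printing) is illustrative. An app using this also needs the NSMotionUsageDescription key in its Info.plist.

```swift
import CoreMotion

// Minimal sketch: read AirPods head orientation via the public
// CMHeadphoneMotionManager API. The system's spatial renderer consumes a
// stream like this internally; third-party code only gets the raw attitude.
final class HeadTracker {
    private let manager = CMHeadphoneMotionManager()

    func start() {
        guard manager.isDeviceMotionAvailable else {
            print("Headphone motion not available (requires supported AirPods).")
            return
        }
        manager.startDeviceMotionUpdates(to: .main) { motion, error in
            guard let attitude = motion?.attitude else {
                if let error = error { print("Motion error: \(error)") }
                return
            }
            // Yaw, pitch, and roll in radians, relative to the reference frame
            // captured when updates started.
            print(String(format: "yaw %.2f  pitch %.2f  roll %.2f",
                         attitude.yaw, attitude.pitch, attitude.roll))
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}
```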
 
The addition of personalized Spatial Audio in iOS 16, using HRTF scanning for AirPods, is an exciting development.
This advancement in the binaural experience enhances immersion, and the competition between Apple's Spatial Audio and Dolby Atmos renderers promises even better end-user experiences.
Looking forward to its potential integration into Logic Pro for music production.
 
I went from Android to iOS recently, but I still have my Samsung Buds Live (three years old, but with special value to me). I still like their sound, so I'm not looking for a replacement at the moment.
But I love Dolby Atmos, and I even enjoy the binaural rendering. It sounds quite nice from Apple Music with the Buds, but I'm curious how much localization would improve with head tracking. (Back in the dummy-head days it was already said that head tracking could help a lot.)
Would you think that this feature alone justifies buying AirPods Pro?

Thanks,
Martin
 
I’ve found the head tracking to be more of a gimmick than anything else. It attempts to “simulate” having a real Atmos system by having the elements stay in a fixed position when you turn your head. So if something is coming from the center channel it stays locked to the “center” even if you turn your head to the right. On paper it seems neat, but in practice I find it more annoying than anything, especially when out on a walk or doing chores at home.

Part of the issue is how long it takes to “reset” where the “front” is, so if you were to round the corner, the center would be stuck where it used to be for a good 5-7 seconds, and then you can kinda hear all the Atmos objects “snap” back into place with the new front position.

That being said, I find the head mapping quite useful. It re-scans your face, much like the Face ID scan, but then also has you keep the phone in the same position and rotate your head to scan your ears. When I've done that, it's a night-and-day difference between the stock Atmos Renderer and the "Personalized" one.
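To make the "locked front" and slow re-centering behavior concrete, here is a rough sketch of how a head-tracked renderer could behave, written purely from the description above rather than from any knowledge of Apple's implementation; the type, the time constant, and the math are all hypothetical.

```swift
import Foundation

// Hypothetical sketch (not Apple's code): objects stay anchored in the room by
// counter-rotating their azimuths with the head yaw, while the reference
// "front" slowly drifts toward wherever the head has been pointing. That slow
// drift is what would produce the several-second lag and then the "snap"
// described above when you round a corner.
struct HeadTrackedPanner {
    /// Current reference "front" direction, in radians (assumed, not documented).
    private(set) var referenceYaw: Double = 0
    /// Time constant for re-centering, in seconds (made-up value).
    var recenterTimeConstant: Double = 6.0

    /// Returns the azimuth, relative to the listener's current head direction,
    /// at which a source authored at `sourceAzimuth` relative to "front" should
    /// be rendered so it appears fixed in the room.
    mutating func renderedAzimuth(sourceAzimuth: Double,
                                  headYaw: Double,
                                  dt: Double) -> Double {
        // Drift the reference front toward the current head direction.
        let alpha = 1 - exp(-dt / recenterTimeConstant)
        referenceYaw += alpha * shortestAngle(from: referenceYaw, to: headYaw)

        // Counter-rotate: a center-channel object (sourceAzimuth == 0) stays at
        // the old front even after you turn your head, until the reference
        // catches up.
        return shortestAngle(from: headYaw, to: referenceYaw + sourceAzimuth)
    }

    private func shortestAngle(from a: Double, to b: Double) -> Double {
        var d = (b - a).truncatingRemainder(dividingBy: 2 * .pi)
        if d > .pi { d -= 2 * .pi }
        if d < -.pi { d += 2 * .pi }
        return d
    }
}
```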
 

Thanks a lot! I thought the tracking would improve the 3D image because of our small, unconscious head movements. I've read about tests with dummy heads where localization (especially from the front) became more precise. But from your description the tracking might not be precise and fast enough.

Anyway, the measurement option seems tempting.
 
I think the head tracking is useful when you are watching a video on your phone; it helps localize the sound with the movie. If you are sitting, it helps, but if you are moving, you are constantly changing direction and the system needs to redefine where the center is...
I expect the Apple Renderer to make progress in all these areas. Adding head mapping, the HRTF, was a big, noticeable improvement.
You can read the release notes for the AirPods firmware upgrades: About firmware updates for AirPods
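On the HRTF point, the binaural rendering itself has had a public API in AVAudioEngine for years; the personalized HRTF profile from the ear scan is applied by the system renderer and is not something apps can load directly. A minimal sketch of the generic (non-personalized) HRTF path, with made-up source positions, looks like this:

```swift
import AVFoundation

// Minimal sketch of generic HRTF binaural rendering with AVAudioEngine.
// The personalized HRTF from the iOS 16 ear scan lives in the system renderer;
// this is only the long-standing public, non-personalized path.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

// 3D-mixed sources must be mono; the environment node renders binaural stereo.
let mono = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)!
engine.connect(player, to: environment, format: mono)
engine.connect(environment, to: engine.mainMixerNode,
               format: engine.mainMixerNode.outputFormat(forBus: 0))

// Ask for HRTF-based rendering and place the source ahead and slightly right.
player.renderingAlgorithm = .HRTFHQ
player.position = AVAudioMake3DPoint(0.5, 0.0, -2.0)

// Listener at the origin, facing forward. Feeding head-tracking yaw into
// listenerAngularOrientation is how sources would stay anchored in the room.
environment.listenerPosition = AVAudioMake3DPoint(0, 0, 0)
environment.listenerAngularOrientation =
    AVAudio3DAngularOrientation(yaw: 0, pitch: 0, roll: 0)

do {
    try engine.start()
    // player.scheduleFile(...) and player.play() would follow with real audio.
} catch {
    print("Engine failed to start: \(error)")
}
```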
 
Good point, I forgot to mention that. For movies/TV the head tracking is pretty good, though (and I'm not 100% sure about this, so don't quote me) I believe the front camera assists with the head tracking in that case, which helps.
 
I am hearing some reports that the firmware update rolling out alongside iOS 18 tomorrow (Monday, Sept 16th) improves on this feature. Just a few rumblings on Reddit, so I can't confirm. It seems to be out already for those on the beta, which I normally run, but I just couldn't be bothered this year.

When I get them, I’ll do a new scan and report back.
 
I have the AirPods Pro 1, so no update for me.
I did a rescan for personalized audio... maybe it's the placebo effect, but it sounds a bit better...
I guess rescanning from time to time helps improve accuracy as the scanning app gets better?
 
Yeah, over here it sounds a touch better but I’m unsure.

I can confirm at the very least that the method for scanning has changed. It used to be a standard front-on view of your face, then an awkward attempt to scan each of your ears. Now it's a single step: you just turn your head alllllll the way to the left, then alllllll the way to the right.

Is it better? Feels like it… slightly… but also kinda hard to tell.

While I don't know about the head scanning for Spatial Audio, I do know that face/head scanning for Face ID to unlock your phone is supposed to learn your face over time using machine learning (all on device, never going to the cloud)… it would make sense if the scan for personalized audio did the same.
 