Anyone who’s ever heard Apple’s proprietary spatial audio with dynamic head tracking will tell you, quite forcefully, that movie content sent from an iPhone, iPad, Mac (with Apple silicon) or Apple TV 4K to your AirPods Max, AirPods Pro or AirPods 3 is utterly transformed.
Still to give it a whirl? You’re in for a treat. Experience the opening scene of Gravity, the shootout in the latest James Bond epic No Time to Die, or pretty much any scene of Poison, ideally from an iPad Pro 12.9 to a set of AirPods Max. But we digress…
At its recent WWDC 2022 event, Apple announced an update to spatial audio that arrives with iOS 16. What’s new? This time, it’s personal.
The announcement was simple enough: Craig Federighi, Apple’s Senior Vice President of Software Engineering, said that with the release of iOS 16, users will be able to use an iPhone’s TrueDepth camera to create a ‘Custom Spatial Audio’ profile.
So no one is asking you to go out and get impressions of your ears from an audiologist – your iPhone camera has that covered. In fact, it’s not the first time we’ve seen this kind of approach from headphone manufacturers.
The Sony Headphones app, for example, has been guiding users through photo shoot-style ear scans using phone cameras for some time now, across two generations of its best-selling WH-1000XM4 and WH-1000XM5 headphones. And, of course, Sony has its own spatial audio format to make the best of it: Sony 360 Reality Audio…
How does spatial audio work now – and how will it get better?
Both Sony 360 Reality Audio and Apple’s head-tracked spatial audio use something called Head-Related Transfer Functions (HRTFs). An HRTF is a kind of formula describing the physical differences in any listener (ear shape, head shape, distance between the ears, whether there’s anything much between them… OK, that last one is a joke) that affect how the listener perceives sound from a given point plotted in space – Sony is very clear that its immersive solution works in a sphere around the listener.
By processing data gathered from thousands of individuals, an HRTF can be created that comes close to the perception and response of the average person – that is, immersive, head-tracked spatial audio that will impress almost anyone.
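In practice, an HRTF is applied as a pair of per-ear filters: the audio is convolved with a left-ear and a right-ear impulse response measured (or averaged) for a given direction. The sketch below is purely illustrative – the toy impulse responses, names and numbers are our own, not anything Apple or Sony ships – but it shows the core trick: the far ear hears the sound a fraction of a millisecond later and slightly quieter, and your brain decodes that as direction.

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Convolve a mono signal with left/right head-related impulse
    responses (HRIRs) to place it at the direction those HRIRs encode."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right])

# Toy HRIRs for illustration only: the right ear hears the source
# slightly later (interaural time difference) and quieter (interaural
# level difference), as if the source sat off to the listener's left.
fs = 48_000
delay = int(0.0006 * fs)            # ~0.6 ms time difference
hrir_l = np.zeros(64); hrir_l[0] = 1.0
hrir_r = np.zeros(64); hrir_r[delay] = 0.5

tone = np.sin(2 * np.pi * 440 * np.arange(fs // 10) / fs)
stereo = render_binaural(tone, hrir_l, hrir_r)   # shape: (2, samples)
```

A real renderer uses measured HRIR pairs for hundreds of directions (and interpolates between them); personalization amounts to swapping the generic averaged set for filters fitted to your own ears.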
So why the need to customize? Well, Sony already does, and in mentioning the iPhone’s TrueDepth camera, Apple hints that its snapper can also do some 3D mapping of your ears, though the company didn’t explicitly say so.
It’s also unclear whether Apple’s custom spatial audio will involve a more robust fit test than the one currently featured on the AirPods Pro, which emits sounds and uses the in-ear microphone to test the effectiveness of the seal you’ve achieved between your ear canal and the AirPod, all in an attempt to deliver the best possible audio quality and noise cancellation. Will the new customization process also involve a hearing test, as seen in products like the ultra-customizable NuraTrue headphones?
But – and this is a compliment, because Apple’s spatial audio is already excellent – will it really get better? That remains to be seen: the iOS 16 developer beta is with a select group of testers now, a public beta will arrive in July, and a full release is scheduled for late 2022, provided you have an iPhone 8 or later. If you’re feeling brave, you can install the beta on your iPhone right now, although we’re not sure we recommend that course of action yet.
Opinion: Apple’s spatial audio will achieve great things – but not with this particular update
However Apple ‘customizes’ its truly wonderful head-tracked spatial audio, I doubt this is the upgrade we all really want to see – and, more importantly, hear.
You see, spatial audio takes Dolby Atmos surround sound signals and adds directional audio filters on top, adjusting the frequencies and volume levels each of your ears hears so that sounds can be placed virtually anywhere around your person. And when you’re using Apple’s top-tier AirPods with an Apple device that supports head tracking, the device is also positioned and recognized as the source of the sound – watch any of the movies recommended at the start of this piece and simply take a few steps away from your device. Now turn slowly. See?
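The head-tracking part of this is, at its heart, simple geometry: the renderer subtracts your head’s yaw (reported by the earbuds’ motion sensors) from the fixed direction of the device, so the sound stays anchored to the screen as you turn. The sketch below is our own toy illustration of that idea – the function names and the crude single-gain-per-ear panning are assumptions, not Apple’s actual renderer, which applies full HRTF filters rather than one gain.

```python
import math

def relative_azimuth(source_deg, head_yaw_deg):
    """Direction of the source relative to where the head points,
    wrapped to [-180, 180). 0 = straight ahead, negative = to the left."""
    return (source_deg - head_yaw_deg + 180) % 360 - 180

def ear_gains(rel_az_deg):
    """Very crude interaural level difference: constant-power pan by
    azimuth. Returns (left_gain, right_gain)."""
    pan = math.sin(math.radians(rel_az_deg))   # -1 (hard left) .. +1 (hard right)
    left = math.sqrt((1 - pan) / 2)
    right = math.sqrt((1 + pan) / 2)
    return left, right

# The device sits dead ahead (0 degrees). Turn your head 90 degrees to
# the right, and the sound should now arrive from your left:
l, r = ear_gains(relative_azimuth(0, 90))
```

Run with those numbers, the left gain goes to 1 and the right to 0 – exactly the “walk a few steps away, now turn slowly” effect described above, just reduced to arithmetic.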
What would be really mind-blowing as far as spatial audio is concerned is the ability to physically walk through a symphony orchestra, stopping next to the bassoonist or the second violins, perhaps. Currently, your device is still the source, so while the experience is immersive, you can’t unlock this truly advanced level of customization; you can’t get closer to the timpani by actually walking over to them.
From an in-app perspective, the option to switch head-tracked content from ‘device as source’ to ‘in situ’ – if you have the space to live out your virtual Sydney Opera House tour at your local community center, for example – would really elevate spatial audio. And that may be coming, but it’s not here yet.
As with all such advancements, it’s when the technology is truly mature and malleable – when the end user can push it to its limits, break it and reassemble it ‘incorrectly’, but in a way they feel is an improvement – that spatial audio will reach its full potential.
I’m not sure that taking a picture of your ear to optimize AirPods spatial audio achieves this, but I don’t want to rain on Apple’s parade either. It’s certainly a step in the right direction – and I’m genuinely excited to see what this award-winning technology achieves next. After all, we’re so converted that we even selected 10 albums we wish were available in Spatial Audio.