Apple’s AirPods are about to get much smarter with Adaptive Audio

Norman Ray

Global Courant

A “conversation awareness” feature will automatically activate transparency mode and lower the volume of music when the AirPods detect that the wearer has started speaking — much like earbuds from Sony and other brands already do.

Apple also demonstrated much better background noise reduction to prevent people on the other side of a call from hearing unwanted distractions. The WWDC AirPods segment featured a marching band surrounding a person wearing Apple’s second-gen AirPods Pro. To the person on the other side of that call, the band cacophony was barely perceptible.

Automatic device switching between Apple’s platforms should work faster and more reliably when using your AirPods, according to the WWDC presentation. These updates are due this fall alongside the release of iOS 17, iPadOS 17, and tvOS 17.


Some of these new AirPods voice features are similar in concept to Google’s Clear Calling. “Clear Calling uses machine learning to recognize, isolate, and eliminate distracting sounds like wind and traffic noise while enhancing your caller’s voice,” Google’s Brian Rakowski said during the company’s Pixel event last October. “So your friends can call you from anywhere, and you’ll be able to hear them just fine.” It’s been rumored that Clear Calling is headed to the Pixel Buds Pro in the not-too-distant future.

