Apple has given a rare speech at a global AI gathering, with vice president Ge Yue choosing to concentrate on Machine Learning in accessibility features.
Ge Yue is an Apple vice president and managing director of Apple Greater China, and has previously spoken about the company's environmental efforts. She has now delivered a speech at the 2022 World Artificial Intelligence Conference in Shanghai, and rather than surveying Apple's wide-ranging efforts in artificial intelligence, she chose to illustrate the technology through accessibility features in the Apple Watch and AirPods Pro.
According to IThome, she said that “Machine Learning plays a crucial role” in Apple’s hope that its products “can help people innovate and create, and provide the support they need in their daily lives.”
“We believe that the best products in the world should meet everyone’s needs,” she continued. “Accessibility is one of our core values and an important part of all products. We are committed to manufacturing products that are truly suitable for everyone.”
"We know that machine learning can provide disabled users with independence and convenience," she said, "including the visually impaired, the hearing impaired, people with physical and motor disabilities, and people with cognitive impairment."
As an example, she touted the AssistiveTouch feature of Apple Watch, which "allows users with upper-limb mobility difficulties to control the Apple Watch through gestures."
“This function combines machine learning on the device with data from the built-in sensors of Apple Watch to help detect subtle differences in muscle movement and tendon activity, thus replacing the display tapping,” she explained.
She also described the use of ML in door detection, using LiDAR scanners, plus "conversation enhancement on AirPods Pro [which] detects and amplifies sound through Machine Learning."
Saying, too, that "our exploration in the field of health has just begun," she added that Apple believes "machine learning and sensor technology have unlimited potential in providing health insights and encouraging healthy lifestyles."
At the same time, Ge Yue also stressed that “effectively protecting user privacy is always our top priority.”
The choice to focus on accessibility, rather than the other topics Apple has highlighted at conferences of this nature, may not be coincidental. Apple is expected to debut new AirPods Pro at Wednesday's "Far Out" event, alongside a refresh of the Apple Watch.