December 4, 2022

Recently, Ge Yue, Apple’s vice president and managing director of Greater China, delivered a speech at the “Plenary Session – Industrial Development” of the World Artificial Intelligence Conference.

Ge Yue: Apple will use machine learning to create a healthier, more accessible future

In her speech, Ge Yue noted that machine learning plays a key role in people’s daily lives: it helps unlock the full potential of Apple’s combination of hardware and software and improves people’s lives in many ways. Machine-learning-powered features like Apple’s can help people from all walks of life lead healthier lives.

Machine learning can help provide independence and convenience for users with disabilities, including the visually impaired, the hearing impaired, those with physical and motor impairments, and those with cognitive impairments. On Apple Watch, for example, AssistiveTouch allows users with limited upper-limb mobility to control the watch through gestures, while Conversation Boost on AirPods Pro uses machine learning to detect and amplify voices so users can hear them more clearly.


Ge Yue also noted in closing that although Apple’s exploration of the health field has only just begun, the company already sees enormous potential for machine learning and sensor technology to provide health insights and encourage healthy lifestyles. All of these features are designed to help create a healthier, more accessible future for everyone.

Below is the full text of the speech:

“Leaders and distinguished guests: Good afternoon, everyone. First of all, I am very grateful to the organizers for inviting me back to the World Artificial Intelligence Conference. I am very happy to have the opportunity to exchange ideas with you today.

At Apple, we want our products to help people innovate and create, providing the support people need in their daily lives. Machine learning plays a vital role here: it can better harness the power of our combination of hardware and software to improve people’s lives in every way. We’ve seen its enormous capabilities countless times.

Today, I want to dive into two areas where the potential to improve people’s lives is particularly clear: accessibility and wellness. We’ll explore some of Apple’s features that machine learning helps enable: some are designed for people with disabilities and special needs, while others help people from all walks of life lead healthier lives.

But, like any technology, machine learning doesn’t work alone, so I’ll start with the innovations that have made it such a powerful tool.


At Apple, we have always focused on integrated product design. Whether in the hardware or the software of our products, we believe that design and integration should go hand in hand.

And a great example of this integration is Apple Silicon, which helps enable powerful new features through strong performance and excellent battery life. The Neural Engine is a key part of these innovations: built specifically for machine learning, it is both powerful and efficient when running machine learning models.

Of course, our cutting-edge machine learning models don’t rely on powerful chips alone; they also require high-quality input, including touch, motion, sound, and vision. We integrate powerful sensors into our devices that can provide fast, highly accurate signals to our machine learning models.

Combining these sensors, state-of-the-art machine learning models, and the power of Apple Silicon, we have designed features that run entirely on-device. Each feature runs on hardware tailor-made for it, maximizing efficiency and delivering the best performance without excessive power consumption. And since no high-speed network connection is required, these features perform more stably and reliably.
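
To make the on-device idea concrete, here is a minimal sketch of how a developer might ask Core ML to keep inference on the CPU and Neural Engine (MLComputeUnits.cpuAndNeuralEngine, available from iOS 16). The model name “GestureClassifier” is hypothetical; only the configuration step reflects the approach described above.

```swift
import Foundation
import CoreML

// A minimal sketch: "GestureClassifier" is a hypothetical bundled model,
// not a real Apple asset. The key step is restricting compute units so
// inference stays on the CPU and Neural Engine.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine  // skip the GPU for power efficiency

do {
    if let url = Bundle.main.url(forResource: "GestureClassifier",
                                 withExtension: "mlmodelc") {
        let model = try MLModel(contentsOf: url, configuration: config)
        print("Loaded model for on-device inference: \(model.modelDescription)")
    }
} catch {
    print("Model failed to load: \(error)")
}
```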

Most importantly, because no data ever needs to leave the device, privacy is better protected. This advantage is especially important for health and accessibility features, where a great user experience is inseparable from efficiency, reliability, and privacy protection.

Let’s start with accessibility. We believe that the best products in the world should meet everyone’s needs. Accessibility is one of our core values and an important part of all our products. We are committed to making products that truly work for everyone.

We know that machine learning can help provide independence and convenience for users with disabilities, including the visually impaired, the hearing impaired, those with physical and motor impairments, and those with cognitive impairments. Let’s look at a few examples, starting with Apple Watch. AssistiveTouch on Apple Watch allows users with limited upper-limb mobility to control the watch through gestures.

Instead of requiring taps on the display, the feature combines on-device machine learning with data from the Apple Watch’s built-in sensors to detect subtle differences in muscle movement and tendon activity. Using the built-in gyroscope, accelerometer, and optical heart rate sensor, users can control Apple Watch with hand movements such as a pinch or a clench.
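
As a rough illustration of the sensing side of this pipeline, the sketch below samples wrist motion with CoreMotion and flags a sharp movement spike. It is a toy heuristic with an invented threshold, not Apple’s trained gesture model.

```swift
import Foundation
import CoreMotion

// Illustrative only: a toy heuristic that treats a brief wrist
// acceleration spike as a candidate "gesture". Apple's AssistiveTouch
// runs trained ML models over these same sensor streams; the 1.5 g
// threshold here is invented.
let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 100.0  // sample at 100 Hz

motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let a = motion?.userAcceleration else { return }
    let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)  // in g
    if magnitude > 1.5 {
        print("Possible pinch/clench candidate detected")
    }
}
```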


Next, let’s talk about AirPods Pro. AirPods Pro combines Apple’s H1 chip with built-in microphones to deliver a powerful listening experience through machine learning. Conversation Boost on AirPods Pro uses machine learning to detect and amplify voices. If you’re talking with someone in a noisy restaurant, Conversation Boost can focus on the voice of the person in front of you so you can hear it more clearly. It is worth noting that this feature runs entirely on-device.
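
The amplification idea can be sketched with AVFoundation: route microphone input through an EQ node and raise its gain. This is only a simplified stand-in; the beam-forming and speech detection that AirPods Pro perform on the H1 chip are not reproduced here.

```swift
import AVFoundation

// A simplified sketch of the amplification idea: microphone -> EQ -> output,
// with a flat gain boost. The 12 dB value is illustrative only.
let engine = AVAudioEngine()
let eq = AVAudioUnitEQ(numberOfBands: 1)
eq.globalGain = 12.0  // dB; boost the whole signal uniformly

engine.attach(eq)
let format = engine.inputNode.outputFormat(forBus: 0)
engine.connect(engine.inputNode, to: eq, format: format)
engine.connect(eq, to: engine.mainMixerNode, format: format)

do {
    try engine.start()
} catch {
    print("Audio engine failed to start: \(error)")
}
```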

Finally, let’s take a look at Door Detection, which recently shipped in iOS 16 and iPadOS 16. Door Detection combines the LiDAR Scanner, the camera, and on-device machine learning to help visually impaired users locate doors, judge how far away a door is, and determine whether it is open or closed. It can even read signs and symbols around doors, such as office room numbers or markings for accessible entrances.
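
One ingredient of such a feature, reading LiDAR-derived depth at a point of interest, can be sketched with ARKit as below. Detecting the door itself would require a vision model, which is assumed here rather than shown.

```swift
import ARKit

// A hedged sketch: read the LiDAR-derived depth (in metres) at a
// normalized image point. Locating the door that the point refers to
// is assumed to come from a separate vision model.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}

func depth(at point: CGPoint, in frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let x = min(Int(point.x * CGFloat(width)), width - 1)
    let y = min(Int(point.y * CGFloat(height)), height - 1)

    // The buffer holds Float32 depth values in metres.
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
    return row[x]
}
```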

Users can also combine Door Detection with People Detection to help visually impaired people move more freely in public spaces, identify people nearby, and maintain social distance. These are just a few examples of how machine learning can have a substantial impact on the lives of people with disabilities. Combining advances in chips, sensor technology, and on-device machine learning makes our products easier to use and helps users better interact with the world around them.

Health is another area of our focus with the potential to improve people’s lives. Technology can play an important role in making our bodies healthier and encouraging people to live healthier lives.

Our machine learning and sensor technology can provide useful health information, allowing users to gradually achieve overall health through small changes in daily behavior. We always ensure that these health features stand up to rigorous scientific validation, and protecting user privacy is always our top priority.

For example, watchOS 9, which we’ll be launching this fall, includes a new Sleep Stages feature that helps users better understand their sleep. Using signals from the heart rate sensor and accelerometer built into Apple Watch, it can detect whether the user is in REM, Core, or Deep sleep and provide detailed metrics such as respiratory rate and heart rate during sleep.
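
Once the watch has recorded these samples, an app can read them back through HealthKit. A minimal sketch, assuming read authorization for sleep analysis has already been granted:

```swift
import HealthKit

// A minimal sketch: read back the sleep-stage samples watchOS records.
// Assumes HealthKit authorization for sleep analysis was already granted.
let store = HKHealthStore()
let sleepType = HKObjectType.categoryType(forIdentifier: .sleepAnalysis)!

let query = HKSampleQuery(sampleType: sleepType,
                          predicate: nil,
                          limit: HKObjectQueryNoLimit,
                          sortDescriptors: nil) { _, samples, _ in
    for case let sample as HKCategorySample in samples ?? [] {
        // The stage-specific values (.asleepREM etc.) require iOS 16.
        switch HKCategoryValueSleepAnalysis(rawValue: sample.value) {
        case .asleepREM:  print("REM sleep: \(sample.startDate)")
        case .asleepCore: print("Core sleep: \(sample.startDate)")
        case .asleepDeep: print("Deep sleep: \(sample.startDate)")
        default: break
        }
    }
}
store.execute(query)
```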


In addition, Fall Detection uses the accelerometer and gyroscope in Apple Watch, together with machine learning algorithms, to identify serious falls. By analyzing wrist trajectory and impact acceleration, Apple Watch can send an alert when the user falls. Of course, we also want to go a step further and find ways to support users before a fall ever happens.
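
In the spirit of that description, the toy sketch below combines the two sensors named above: a hard accelerometer impact coinciding with a burst of gyroscope rotation. Every threshold is invented; Apple’s actual Fall Detection relies on trained models, not fixed cutoffs.

```swift
import Foundation
import CoreMotion

// Illustrative only: flag a hard impact (accelerometer) that coincides
// with rapid wrist rotation (gyroscope). The 3 g and 5 rad/s thresholds
// are invented for this sketch.
let manager = CMMotionManager()
manager.deviceMotionUpdateInterval = 1.0 / 100.0

manager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let motion = motion else { return }
    let a = motion.userAcceleration
    let r = motion.rotationRate
    let impact = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)  // in g
    let spin = sqrt(r.x * r.x + r.y * r.y + r.z * r.z)    // in rad/s
    if impact > 3.0 && spin > 5.0 {
        print("Possible hard fall: consider prompting the user")
    }
}
```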

To that end, we created Walking Steadiness, a first-of-its-kind health feature that uses motion data captured by iPhone as users move around to assess their risk of falling. This important information can help people gradually improve their mobility and thereby reduce the risk of falls.
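
That steadiness score surfaces in HealthKit as the appleWalkingSteadiness quantity (iOS 15 and later), so reading the latest value is a short query. A minimal sketch, assuming read access has been granted:

```swift
import HealthKit

// A minimal sketch: fetch the most recent Walking Steadiness percentage.
// Assumes read authorization for this type was already granted.
let store = HKHealthStore()
let steadinessType = HKObjectType.quantityType(forIdentifier: .appleWalkingSteadiness)!
let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate,
                                   ascending: false)

let query = HKSampleQuery(sampleType: steadinessType,
                          predicate: nil,
                          limit: 1,
                          sortDescriptors: [newestFirst]) { _, samples, _ in
    if let sample = samples?.first as? HKQuantitySample {
        // percent() returns a fraction (0.0-1.0), so scale to a percentage.
        let fraction = sample.quantity.doubleValue(for: .percent())
        print("Walking Steadiness: \(Int(fraction * 100))%")
    }
}
store.execute(query)
```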

While our exploration in health is just beginning, we are already seeing tremendous potential for machine learning and sensor technology to provide health insights, encourage healthy lifestyles, and more. All of these features serve our mission of creating a better life. We are hopeful about the future of machine learning: we strongly believe that it can inspire more innovations that improve people’s lives.

It can help us understand our physical condition and develop healthier living habits. It can lower the barriers to use of technology and bring the outside world closer. It also protects our privacy and gives us confidence in technology.

At Apple, we strive to innovate, empower users, and make technology a force that improves people’s lives. We are excited to continue down this path, using machine learning to help create a healthier and more accessible future for all. Thank you!”
