Apple has announced a handful of new accessibility features set to arrive later this year, including “Eye Tracking,” which will allow users to control their iPhones or iPads with just their eyes.

According to Apple, the new Eye Tracking feature is powered by artificial intelligence and is designed for users with physical disabilities. The technology utilizes the device’s front-facing camera to set up and calibrate within seconds.

“With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes,” reads Apple’s official statement on the new accessibility features. The company promises that the data secured by the eye scan isn’t shared with Apple and is kept solely on the device.

Another new feature coming to Apple devices is Music Haptics, a new way for users who are deaf or hard of hearing to experience music on the iPhone. When turned on, the phone’s Taptic Engine will play taps, textures, and refined vibrations in time with songs playing through Apple Music.

There are several more accessibility features coming to the iPhone, including Vocal Shortcuts for users with atypical speech, Vehicle Motion Cues to reduce motion sickness for passengers in moving vehicles, Voice Control for Apple’s CarPlay feature, and systemwide Live Captions that can be used on FaceTime calls and in various other apps. More details on the accessibility updates headed to iPhones and other Apple devices later this year are available on Apple’s website.

While these new accessibility features are undoubtedly a boon for users, Apple has been catching some flak lately for an iPad Pro commercial that depicted vintage instruments being destroyed, for which the company later apologized. Meanwhile, Apple is also the subject of an antitrust lawsuit from the Department of Justice.
