Earlier this week, Apple introduced a slew of accessibility features across its devices designed to help a wide variety of people use its products more fully. The features provide solutions for people with disabilities, whether physical or cognitive.

"It's really exciting that people of all abilities and all strengths and needs will be able to use these devices," Betsy Furler, the founder and CEO of For All Abilities, told Lifewire in a phone interview. "It's been an afterthought in the past, but Apple has really done a great job in being a leader in accessibility."

Accessibility Across Devices 

Apple's announcement included four major accessibility updates coming to its devices. Updates to VoiceOver, Apple's screen reader for the blind and low-vision communities, allow users to explore details about the people, text, table data, and other objects within images. Apple also added support for bi-directional hearing aids, so people who use them can have hands-free phone and FaceTime conversations.

However, Furler said she is most excited about the eye-tracking support coming to iPadOS. The feature will make it possible for people to control their iPad using just their eyes.

"It's something that people may not at first think is a big deal, but it's literally a feature I've been waiting for for 10 years," she said.

Furler said iPadOS support for third-party eye-tracking devices will open up many apps to people who could not have used them before, especially people who cannot use their hands or are nonverbal.

Apple also introduced a feature that allows users to control their Apple Watch by making hand gestures rather than touching it. Furler said this feature is also a big deal, and not just for people with disabilities.

"Sometimes people think of accessibility features as someone with one arm or someone who doesn't have any mobility, but really it also is going to impact people who don't have the dexterity to touch one little area on the watch, like people who are aging or people with clumsy fingers," she said.

AssistiveTouch for watchOS lets users control their Apple Watch through subtle differences in muscle movement and tendon activity, such as a pinch of the fingers or a clench of the hand. Apple said the technology behind AssistiveTouch uses "built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning."

"The Apple Watch is such a convenient tool, and this feature is going to open up this device to so many different people who haven't been able to use it before," Furler said. "By Apple making [accessibility] kind of cool and widely available, it becomes part of the norm and makes it typical."

Tech Prioritizing Accessibility 

More and more tech companies have begun to prioritize accessibility features and make them widely available in everyday devices. Last year, Google introduced Look to Speak, an app that lets users look left, right, or up to select what they want to say from a list of phrases, which the app then speaks aloud.

Other platforms are adding accessibility features as well: Instagram now automatically adds captions to Stories with a simple sticker, and Xbox has added speech-to-text and text-to-speech capabilities to Xbox Party Chat.

Furler said it's amazing that accessibility is becoming more widely discussed and available in the devices and platforms we use every day.

"Accessibility, in general, brings so much acceptance to the fact that we're all different," she said. "In the grand scheme of things, the more accessibility features we can have will bring our culture to a place where we understand we are all different."

She said she believes the future of accessibility in tech will expand to address disabilities beyond blindness and deafness.

"For a long time, vision and hearing were really the focus of accessibility in terms of web or tech accessibility," she said. "I hope we're going to keep on moving forward with companies thinking more and more outside of just vision and hearing and thinking about cognitive accessibility as well."