As technology continues to evolve, new accessibility challenges also arise. The impact that mobile tech alone has had on almost everyone’s lives over the past decade is nothing short of extraordinary. And with the Internet of Things (IoT) just starting to gain steam, technology is about to touch even more parts of our world and daily lives.

Unfortunately, that also increases the risk that disabled users will be excluded from these innovations due to a lack of planning around accessibility. Thankfully, new technologies are also being created that greatly assist disabled users.

Video Chatting

Although it has been available on desktop computers for nearly twenty years, video chatting is still relatively new on mobile devices. Bringing it to mobile gives the people who rely on it far more flexibility in when and where they can communicate.

Video chat technology has made a profound impact on the hearing-impaired community. Its presence on mobile devices allows hearing-impaired users to communicate with each other in sign language, in real time, no matter where they are.

Automatic Captions

Another example is artificial intelligence (AI), which is already being used to add closed captions to videos for hearing-impaired users. While the conversion from speech to text isn’t always flawless, it is a step up from videos without captions. Big platforms like YouTube use AI and voice recognition to generate closed captions automatically. Once a draft is generated, creators can edit it for accuracy before publishing. It’s like a closed caption starter kit, reducing the work required to transcribe each video manually.
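To make the “starter kit” idea concrete, here is a minimal sketch in TypeScript of the formatting step: it takes transcript segments of the kind a speech-to-text model typically produces (a start time, an end time, and the recognized text) and turns them into WebVTT, a standard caption format browsers understand. The segment data and function names are illustrative only; the speech recognition itself isn’t shown.

// Minimal sketch: turn machine-generated transcript segments into a WebVTT
// caption draft. The Segment shape mirrors typical speech-to-text output;
// the recognition model itself is out of scope here.

interface Segment {
  start: number; // seconds from the beginning of the video
  end: number;
  text: string;  // machine-generated text, still needing human review
}

// Format seconds as HH:MM:SS.mmm, the timestamp style WebVTT expects.
function toTimestamp(seconds: number): string {
  const h = Math.floor(seconds / 3600);
  const m = Math.floor((seconds % 3600) / 60);
  const s = (seconds % 60).toFixed(3).padStart(6, "0");
  return `${String(h).padStart(2, "0")}:${String(m).padStart(2, "0")}:${s}`;
}

// Build the .vtt file contents: a header, then one cue per segment.
function toWebVTT(segments: Segment[]): string {
  const cues = segments.map(
    (seg, i) =>
      `${i + 1}\n${toTimestamp(seg.start)} --> ${toTimestamp(seg.end)}\n${seg.text.trim()}`
  );
  return ["WEBVTT", ...cues].join("\n\n") + "\n";
}

// Example: a two-cue draft a creator would still proofread before publishing.
const draft = toWebVTT([
  { start: 0.0, end: 2.5, text: "Welcome back to the channel." },
  { start: 2.5, end: 6.0, text: "Today we're talking about web accessibility." },
]);
console.log(draft);

The output is exactly the kind of draft described above: usable in a pinch, but meant to be proofread and corrected before it goes live.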

It’s important to note that if you’re creating a video, you should never publish AI-generated closed captions without reviewing and editing them first. Even though AI is quite capable, it tends to stumble over certain words and especially struggles with acronyms. The resulting errors could be benign, funny, or downright embarrassing. You’ll never know unless you proofread them.

Still, as AI continues to develop, its accuracy will improve. We can look forward to even better automatic captions; until then, we just need to be patient and watch for errors. Eventually, AI may even become sophisticated enough to automatically generate audio descriptions of on-screen action so that visually impaired users can also enjoy video content.

Future Possibilities

Given AI’s potential, it’s difficult to imagine all of the other ways it could improve the lives of users with disabilities. Still, we can speculate about some fairly intuitive ones. For example, browsers or screen readers built with an AI component could “learn” user preferences over time.

These preferences could be general or site-specific, and the technology could adjust its behavior automatically to match what the user prefers. AI-enhanced browsers and screen readers might also learn to cope with badly coded sites, or with accessibility features that the user’s favorite browser doesn’t fully support.

All of this could add up to a much better, more seamless user experience.
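To illustrate what “site-specific preferences” might mean in practice, here is a purely speculative sketch: a small preference store that remembers adjustments the user made on individual sites and falls back to general defaults everywhere else. Every name in it is hypothetical; no existing browser or screen reader exposes this exact API.

// Speculative sketch only: one way a browser or screen reader could remember
// per-site preferences and fall back to general defaults.

interface ReadingPrefs {
  fontScale: number;                          // e.g. 1.0 = default size, 1.5 = 150%
  verbosity: "terse" | "normal" | "detailed"; // how much a screen reader announces
  highContrast: boolean;
}

class PreferenceStore {
  private general: ReadingPrefs = { fontScale: 1.0, verbosity: "normal", highContrast: false };
  private perSite = new Map<string, Partial<ReadingPrefs>>();

  // Record an adjustment the user made while on a particular site.
  learn(hostname: string, change: Partial<ReadingPrefs>): void {
    const existing = this.perSite.get(hostname) ?? {};
    this.perSite.set(hostname, { ...existing, ...change });
  }

  // Site-specific overrides win; otherwise the general defaults apply.
  prefsFor(hostname: string): ReadingPrefs {
    const overrides = this.perSite.get(hostname) ?? {};
    return { ...this.general, ...overrides };
  }
}

// Example: the user bumps the font size on one news site; only that site changes.
const store = new PreferenceStore();
store.learn("news.example.com", { fontScale: 1.5, highContrast: true });
console.log(store.prefsFor("news.example.com")); // larger text, high contrast
console.log(store.prefsFor("blog.example.org")); // general defaults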

Another emerging technology worth talking about is eye tracking, which uses special cameras to follow the movement of a user’s eyes. It is also a great example of unintended positive consequences.

Initially, the technology was built so that marketers and advertisers could see where their audience was really looking when viewing marketing materials. As sometimes happens when new technology is created, eye tracking brought a “happy accident” with it, offering unexpected but very desirable uses beyond its original purpose.

Users with mobility impairments who are unable to use a mouse or a touchscreen can face serious challenges when navigating the web. Eye tracking can help such users interact with a page just by looking at, and blinking on, the elements they want.

Alternative inputs do exist, such as adapted keyboards, foot pedals, and sip-and-puff devices, but using them can be slow and exhausting. With eye tracking, a user could navigate the digital world just by moving their eyes. A vocabulary of eye gestures is also being developed to make online navigation more straightforward and more comfortable for users with disabilities (and perhaps for those without them as well).
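As a rough illustration of how gaze-driven navigation could work on the web, here is a sketch that assumes a hypothetical eye tracker reporting gaze coordinates and blink events; finding and activating the element under the user’s gaze uses only standard DOM calls such as document.elementFromPoint.

// Rough sketch of gaze-driven navigation in the browser. The eye tracker
// itself is hypothetical: we only assume it reports gaze coordinates and
// blink events. Finding and clicking the element uses standard DOM APIs.

interface GazeSample {
  x: number; // viewport coordinates reported by the (hypothetical) tracker
  y: number;
}

let lastGaze: GazeSample = { x: 0, y: 0 };

// Keep track of where the user is currently looking.
function onGaze(sample: GazeSample): void {
  lastGaze = sample;
}

// Treat a deliberate blink as a "click" on whatever is under the gaze point.
function onBlink(): void {
  const target = document.elementFromPoint(lastGaze.x, lastGaze.y);
  if (target instanceof HTMLElement) {
    target.focus(); // move keyboard/screen-reader focus along with the gaze
    target.click(); // activate links, buttons, and other interactive elements
  }
}

// Wiring this up would depend entirely on the tracker's own SDK, e.g.:
// tracker.on("gaze", onGaze);
// tracker.on("blink", onBlink);

A real system would also need calibration, dwell times, and safeguards against accidental blinks, but the core idea is that simple: look, blink, click.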

Change Is Happening

There are also more web-centric innovations emerging to help disabled users. The UserWay widget, for example, offers many tools to enhance the accessibility of any website for users with various disabilities, including dyslexia, low vision, and color blindness.

Because UserWay’s Accessibility Widget does not require substantial re-coding of a website, it provides an immediate accessibility boost for users. Hopefully, as time goes on, we will see more advanced technologies with built-in accessibility tools that help users customize their experience and ensure full inclusion for all audiences.