A peek into the future of technology and accessibility
As accessibility gains more traction in businesses, we need to look at the fundamentals. What does it mean to be accessible? How can technology advance to break barriers and make the world accessible?
I've heard people at both small businesses and large legacy corporations say that "accessibility is a small percentage of our users and not a priority at the moment." This is a very narrow-minded approach, especially when you take into account the broad definition of accessibility. The problem is that once accessibility is placed on the back burner, it is tough to prioritize again, often costing companies several times more than if it had been considered from the beginning.
The definition of accessibility:
The quality of being able to be reached, accessed, or being easy to obtain or use.
When a product is being designed and accessibility comes up, sight is usually the first thing that comes to mind, typically in the form of color blindness or total vision loss.
Take a moment to think about accessibility and all the implications it has. You may realize there are many more issues, ranging from impaired motor skills to cognitive issues like dyslexia, and conditions ranging from anxiety to Alzheimer's.
Accessibility takes into account not only the physical and mental capabilities mentioned above but also external forces. Some examples are unstable internet, income level, and access to food, transportation, and healthcare. When designing products, whether physical goods or digital ones, all of these considerations need to be taken into account to realize the impact.
Unfortunately, we can't solve all the world's problems in a single article, and best practices for each type of accessibility issue could fill an entire series. For now, let's focus on the three senses most products use and what most people think of when discussing accessibility: sight, hearing, and touch. Throughout this article you'll find perspectives and ideas that build on the heightened senses of disabled people, which may spark some inspiration. For the remainder of this article, we will also proceed on the assumption that the majority of people with disabilities can utilize at least two of these senses, as the diagram below demonstrates.
- A product that is only accessible via touch and visuals is accessible to deaf users but inaccessible to blind users.
- A product that is only accessible via audio and touch, with no visuals, is inaccessible to deaf users but may be accessible to blind users.
- In the (currently rare) case that a product is only accessible via visuals and audio, with no touch interactions, it will likely be accessible to most users.
- If a product uses all three senses, then it will be accessible to everyone except the smaller percentage of users we are not covering in this article.
- Finally, if a product utilizes only one sense, it is even less accessible than those with two interaction behaviors. Using only one method is highly discouraged, especially if the product is a necessity in everyday life.
In this article, I will discuss the details of auditory and visual disabilities while exploring some concept technologies I've come up with that would benefit not just those with disabilities, but anyone that uses the product.
Disabling hearing loss doesn't always mean total deafness; it includes those who may need a hearing aid — though the hearing aid may not prove effective. How can we communicate seamlessly with the hearing impaired through our products, and even in peer-to-peer situations? We will be exploring these ideas in a bit.
Did you know there are approximately 466 million people in the world with disabling hearing loss?¹
Many products today are visual, so utilizing sound is not as necessary yet — but as we move into new conversational technologies, those with hearing disabilities are being left behind. The world is looking towards devices and communications where all you need is voice communication.
Consider assistive AI such as Siri and Cortana, and smart home devices like Amazon's Alexa and Google Home. These technologies have started to pick up momentum in the consumer space as they establish themselves as genuinely useful rather than just a gimmick. Unfortunately, products like Amazon's Alexa and Google Home utilize only audio with no visual feedback, making them inaccessible to people with hearing disabilities.
If a product wants to take the world by storm, it needs to cover all three major senses. This won't just benefit disabled users; it will allow anyone to adopt and use the product in the way most comfortable to them. If there is one thing I have learned from user testing, it's that a single product has many more ways of getting the same task done than the designer intended.
A sound perspective
Let's look at deafness from a different perspective, a perspective that appears to be advantageous to those hard of hearing, perhaps even more advantageous than those that can hear, especially as technology evolves.
Many deaf people use vibration to hear. For example, Don Grushkin, a deaf professor teaching language at the University of Sacramento, mentioned in a Quora post²:
We definitely do feel vibrations. In fact, this is an important source of information about our environment, especially the auditory environment, second to our eyes…We sometimes hold onto balloons at concerts to help pick up the vibrations of the music.
Grushkin goes on to mention that while driving, a large truck passing by or a car playing music creates vibrations that alert him to these oncoming obstacles. Because music's vibrations fluctuate while a passing truck's are more consistent, it is possible to infer the difference between the two sources of sound.
Those with auditory disabilities often sharpen their other senses to perceive things the rest of us miss. Instead of dwelling on the weaknesses of those who are disabled, let's continue by looking at their strengths and how we can build on them to communicate clearly.
Bone conduction headsets are not a new technology, but I was surprised to learn that hearing aids rarely use it. Bone conduction uses vibration to send audio through your jaw directly to your inner ear, bypassing the ear canal entirely. A quote from Android Central³ on how these work states:
Since they don't depend on the eardrum, they can be great for people with hearing deficiencies, and since they aren't in or over your ear, you'll be able to hear what's happening around you. They aren't very good at reproducing sound that's true-to-life because of their design, but sometimes sound quality isn't the most important feature in a pair of headphones.
Interestingly enough, in my research I haven't found any evidence that medical-grade hearing aids are using this technology on a widespread level. Hearing aids work by amplifying sound and sending it into the inner ear, in hopes that the hearing impaired can hear it better. You could consider this method a middle ground between air and bone conduction.
So why aren't hearing aids using bone conduction technology? Well, they are, but not to the level you might expect. There are some advances happening, such as one in Japan by a group of researchers from the Daiichi Institute of Technology in Kirishima.⁴ They conceived a technology that uses vibration to help those with hearing impairment in one ear improve their perception of sound direction and orientation. This development is a step in the right direction, even if it is aimed at people who still have some level of hearing.
Food for thought
This next section may seem a bit far-fetched, and I'm not a linguistics major, so bear with me.
After doing some research on vibrations, and understanding that other senses are heightened, I started to brainstorm some ideas. If technology proved capable, a language could be developed using vibration — one that translates speech into vibrational patterns to allow deaf people to listen in on the world around them. It would be similar to Morse code, but with the short and long auditory beeps translated into vibration.
If a technology like this were to be created, I can only imagine it being translated on a universal level across languages. Even people with hearing abilities could learn this language and use this technology to communicate with people from different regions around the world without using a translation app or hiring a translator. Just imagine a universal language in vibration which allows anyone to naturally speak to any other person of another language. I mention this universal language now because unfortunately, we missed this opportunity with both braille and sign language.
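As a toy illustration of how such a vibration language might be encoded, here is a sketch that maps text to short and long vibration pulses, borrowing Morse code's dot/dash alphabet as a stand-in for a purpose-built language. All durations and names here are my own illustrative assumptions, not any real device API:

```python
# Sketch: translate text into a vibration-pulse sequence, using Morse
# code as a stand-in for a purpose-built vibration language.
# Pulse durations (in milliseconds) are illustrative assumptions.

MORSE = {
    "a": ".-", "b": "-...", "c": "-.-.", "d": "-..", "e": ".",
    "f": "..-.", "g": "--.", "h": "....", "i": "..", "j": ".---",
    "k": "-.-", "l": ".-..", "m": "--", "n": "-.", "o": "---",
    "p": ".--.", "q": "--.-", "r": ".-.", "s": "...", "t": "-",
    "u": "..-", "v": "...-", "w": ".--", "x": "-..-", "y": "-.--",
    "z": "--..",
}

SHORT_MS, LONG_MS = 80, 240        # dot vs dash vibration lengths
GAP_MS, LETTER_GAP_MS = 80, 240    # pause within vs between letters

def text_to_pulses(text):
    """Return a list of (vibrate_ms, pause_ms) tuples for a haptic motor."""
    pulses = []
    for word in text.lower().split():
        for letter in word:
            code = MORSE.get(letter, "")
            for i, symbol in enumerate(code):
                duration = SHORT_MS if symbol == "." else LONG_MS
                # longer pause after the final pulse of each letter
                pause = LETTER_GAP_MS if i == len(code) - 1 else GAP_MS
                pulses.append((duration, pause))
    return pulses

print(text_to_pulses("hi"))
# → [(80, 80), (80, 80), (80, 80), (80, 240), (80, 80), (80, 240)]
```

A real system would of course design the alphabet around how well skin discriminates pulses, rather than reusing Morse, but the encoding pipeline would look much the same.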
Despite the above being exploration, Apple has already created a very primitive version of this language with the Apple Watch. During turn-by-turn navigation, a user is notified of the direction of their next turn through distinct vibration patterns. After feeling these patterns a few times, understanding what is being "said" becomes second nature.
If I am correct in my research on the Apple Watch's haptic feedback, it utilizes two different points: the left and right sides of your wrist (or the top and bottom edges of the watch). If the Apple Watch had haptic feedback in all four corners and integrated different pulse lengths and intensity levels, the combinations would be virtually limitless.
Below is a diagram I created showing how I envision haptic feedback on a product like the Apple Watch (or a similar device) translating speech into a vibration-based language. In this case, the imaginary phrase translates to haptic feedback in three of the corners, with multiple pulses of varying strength in each.
In the diagram, the diameter of each ring indicates when the feedback was sent, and the border thickness corresponds to the intensity of the vibration. The example could represent someone asking, "How are you?"
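To make the idea concrete, here is a sketch of how such a four-corner haptic "phrase" could be represented in software, and how quickly the pattern space grows. The corner names and the duration/intensity levels are hypothetical assumptions of mine, not a real watch API:

```python
# Sketch: model a haptic "phrase" on a watch with four corner actuators.
# Corner names, duration levels, and intensity levels are hypothetical.

from dataclasses import dataclass
from itertools import product

CORNERS = ("top_left", "top_right", "bottom_left", "bottom_right")
DURATIONS = ("short", "long")              # pulse length levels
INTENSITIES = ("soft", "medium", "hard")   # vibration strength levels

@dataclass(frozen=True)
class Pulse:
    corner: str
    duration: str
    intensity: str

# Every distinct single pulse this imaginary hardware could emit:
alphabet = [Pulse(c, d, i) for c, d, i in product(CORNERS, DURATIONS, INTENSITIES)]
print(len(alphabet))  # 4 corners x 2 durations x 3 intensities = 24 "letters"

# A hypothetical phrase like "How are you?" might be a short sequence of
# pulses across three of the corners, as in the diagram above:
phrase = [
    Pulse("top_left", "short", "soft"),
    Pulse("top_right", "long", "hard"),
    Pulse("bottom_right", "short", "medium"),
]

# Even restricting phrases to three pulses gives 24**3 = 13,824 patterns.
print(len(alphabet) ** 3)
```

Adding a fourth pulse, or finer intensity steps, multiplies the vocabulary further, which is why the combinations feel practically limitless.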
Vision loss is arguably one of the most disabling of disabilities, as sight is required for nearly everything in a normal life as we know it. The World Health Organization estimates that 2.2 billion people have some form of vision impairment, roughly one billion of them moderate to severe.⁵
The absence of necessary accessibility
Take a moment to think about products that are not accessible to blind users. Look around your house or office: how many products are inaccessible, or require some level of guidance? Ask yourself, if you were blindfolded, could you still use the product? Even after using it for days, weeks, or years, could you confidently use it blindfolded? You may find that many products, even ones essential for daily life, are impossible to use without sight.
Microwaves, ovens, stove tops, and many other home appliances have flat-panel buttons and no distinct audible feedback. Today I went to use my microwave, which has an entirely flat button panel with lots of functionality, none of which gives audio feedback. How can a blind user living alone confidently input a one-minute timer and not a four-minute one, or hit defrost and not bake (on combination microwave ovens)? Most likely they can't. Many new appliances have fancy touch-panel keys, yet when I read through the manuals of some appliances online, I found no settings to enable spoken feedback. For something so essential to daily living, this absence of accessibility is absolutely absurd.
Many may argue that appliances now connect to smartphones, which have the accessibility features and feedback required. While a companion application does have the side effect of making appliances more accessible, it still creates unnecessary steps for those with vision impairment. On older stoves, at least the knobs were obvious, and a gas flame's heat level can be sensed clearly even by a blind user. If we want to advance our products with today's technology, then an oven should have voice assistance baked in.
A visually impaired person has to use audio to navigate a product or website — even when using a keyboard, audio feedback confirms their actions and movement. The problem with audio feedback is that it's often overly descriptive, especially when it comes to screen readers.
If you've ever heard a screen reader, or tried one yourself, you may find that a lot of the information is unnecessary and creates mental overload. Software like JAWS helps ease this situation, but navigating complex websites can still become overwhelming. It takes careful planning to make a voice assistant succeed: finding the right balance between too descriptive and not descriptive enough. We've seen many voice assistants try to be a tool of all trades and end up master of none, attempting everything to the point of being nearly useless at anything.
You have probably spotted the flaw in my Venn diagram by now. I stated earlier that if an interface has touch and audio feedback, it is accessible to blind users. The edge case, which is more common than we realize, is when the touch panel is entirely flat, like the stove above. What happens if a blind user is off target by just a few millimeters, setting the wrong temperature or failing to turn off the stove? That's not something anybody should have to endure because of poor product planning.
Unless there is clear tactile feedback or very clearly stated audio feedback, even a product that engages both the audio and touch senses may still be inaccessible, or mentally overwhelming, for blind users.
Power-user vs blind user
There is another user persona for digital products that is related to sight-impaired users: power users. There are many similarities in how a power user and a vision-impaired user operate software from the keyboard. Both rely on it heavily, and in my experience, many power users find software frustrating when keyboard navigation doesn't work properly. So how can we find a balance between these two personas, such that the keyboard works the same way for both, the only difference being a toggle for voice feedback?
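One way to serve both personas is a single keyboard-navigation layer where speech output is just an optional setting. Here's a minimal sketch of that idea; the class, the `speak` placeholder, and the captured-speech list are all hypothetical illustrations, not any real accessibility API:

```python
# Sketch: one keyboard-navigation layer shared by power users and
# vision-impaired users; the only difference is a voice-feedback toggle.
# speak() is a placeholder where a real app would call a TTS engine.

class KeyboardNavigator:
    def __init__(self, items, voice_feedback=False):
        self.items = items            # focusable elements, in order
        self.index = 0                # currently focused element
        self.voice_feedback = voice_feedback
        self.spoken = []              # captured speech, for illustration

    def speak(self, text):
        if self.voice_feedback:
            self.spoken.append(text)  # real app: hand off to text-to-speech

    def handle_key(self, key):
        # identical key bindings for both personas
        if key == "ArrowDown":
            self.index = min(self.index + 1, len(self.items) - 1)
        elif key == "ArrowUp":
            self.index = max(self.index - 1, 0)
        self.speak(f"Focused: {self.items[self.index]}")
        return self.items[self.index]

# The same keystrokes drive both experiences:
power_user = KeyboardNavigator(["Home", "Search", "Settings"])
blind_user = KeyboardNavigator(["Home", "Search", "Settings"],
                               voice_feedback=True)
power_user.handle_key("ArrowDown")   # silent and fast
blind_user.handle_key("ArrowDown")   # also announces "Focused: Search"
```

Because the navigation logic is shared, fixing a keyboard bug for one persona automatically fixes it for the other — the voice layer is purely additive.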
When we think of augmented reality, vision-impaired people using this technology probably don't come to mind. On the contrary, I find this technology to be a perfect accessibility tool for our blind community members.
Augmented reality paired with audio feedback can serve as the eyes of a blind person: reading signs, alerting them to obstacles, and more. If a blind person were to use augmented reality, it could really elevate their level of perception and bring a stronger awareness of their surroundings.
Without AR (augmented reality), visually impaired people may only understand what's directly in front of them, unless they are familiar with the environment.
AR tied to audio feedback allows the user to take in more of their environment, from a wider angle of awareness.
In the diagrams shown above, we can imagine that someone using augmented reality can take in the environment across up to a 120° viewing angle, whereas previously they relied only on memory or where their walking cane directed them. In fact, with audio and other sensors, one could have 360° awareness. This can let the user learn about new places and locations: what's happening around them, how crowded an area is, where the obstacles are, and so on. Augmented reality could work like the audio tour of a museum, except for an entire city.
The hype around augmented reality is about adding a new dimension to our already advertisement-saturated, overloaded world, but this new medium can and should be adopted to help those with disabilities first.
Google's labs have produced a lot of interesting projects. Project Soli has always been one of my favorites because it helps separate hardware from our daily lives. As the technology advances, the idea of not having to look at my watch, not grabbing a remote, or even removing buttons entirely from some interfaces is truly inspiring.
Of course, the other aspect of this technology is that gestures can be learned easily and used without issue by the visually impaired. Going back to the microwave and appliance examples — imagine your appliances understanding some form of sign language via a radar chip like Project Soli, with audio confirmation of each action.
Radar technology such as Soli is another great way to seamlessly integrate interactions for blind users while maintaining the aesthetics of a clean interface, without requiring tactile feedback. In a way, you could consider this another form of augmented reality. The technology shipped inside Google's Pixel 4, and I hope they plan to expand it further.
Many emerging technologies are aimed first at the masses before disabled users are even considered. I believe some of these technologies, such as city audio tours utilizing a simple augmented reality camera (or even a first iteration using only a device's GPS), could greatly increase mobility and the discoverability of new places for disabled members of society.
When a new technology surfaces, we need to ask ourselves: if it betters life and becomes essential to our daily routines, how does it affect those around us who are less fortunate, in every way from disability to income? And how can we repurpose even 5% of the product to address this?