Improving Accessibility in React Native Applications – Part #1: Tools

Your application may need accessibility improvements (photo by https://unsplash.com/@joshcala)

We all know that React Native is evolving rapidly in all areas, and recently we’ve seen several enhancements in accessibility. That’s why I decided to write up the current state of improving accessibility in React Native applications.

In two separate articles, I will cover tools and techniques that help you as a developer boost accessibility in your application. This first article focuses on setting up the developer tools. The second article will cover the techniques and a code-level introduction.

What is it all about?

As mobile developers, we know that the available viewport is much smaller than on desktop and web applications. On top of that, there are many restrictions and requirements to meet in order to create an intuitive user interface and a fluent user experience. This is why we should care about accessibility just as much as UI and UX design.

Accessibility — also known as a11y — measures how well a user can reach their goals and intentions even when they have disabilities or impairments while using the interface. For instance, all elevator buttons should be low enough for a child or a person in a wheelchair to reach.

Accessibility Tools

Since version 0.59, React Native offers several properties to help with accessibility issues. I’ll cover them in more detail in the next article. But first, we need proper tools in order to track down what is going on.
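To give a rough idea before the second article, here is a minimal sketch of a few of those properties on a touchable element (the component name, labels and handler are just hypothetical examples):

import React from 'react';
import { Text, TouchableOpacity } from 'react-native';

// A hypothetical button showing the basic accessibility properties.
const SubmitButton = ({ onPress }: { onPress: () => void }) => (
  <TouchableOpacity
    accessible={true}
    accessibilityRole="button"
    accessibilityLabel="Submit the form"
    accessibilityHint="Sends your answers for review"
    onPress={onPress}
  >
    <Text>Submit</Text>
  </TouchableOpacity>
);

export default SubmitButton;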

Here are the tools I recommend using: React Native Debugger, Reactotron, Xcode’s Accessibility Inspector, VoiceOver (iOS), Accessibility Scanner (Android) and TalkBack (Android).

How to Use These Tools

Most of these tools have good instructions and tutorials, so feel free to dive deep into them. In this article, I’ll just scratch the surface.

React Native Debugger

If you’re familiar with any debugging tool (such as a browser’s developer tools), RN Debugger is a very straightforward tool to use. From an accessibility perspective, when an element is inspected, the debugger shows the structure of the application and the properties of the element.

RN Debugger displays the properties of the inspected element

Reactotron

Reactotron is a popular stand-alone tool that isn’t specific to developing accessibility features. However, it’s an excellent way to track the application state and network requests. For instance, if you offer accessibility settings, it’s rather easy to track them by subscribing to app state actions and changes.

Reactotron has plenty of features for debugging the application
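Getting Reactotron connected doesn’t take much. As a rough sketch, a development-only configuration could look something like this (the file name and app name are just examples, and the config file would be imported early in your entry point):

// ReactotronConfig.ts: a minimal development-only setup.
import Reactotron from 'reactotron-react-native';

if (__DEV__) {
  Reactotron
    .configure({ name: 'MyAccessibleApp' }) // name shown in the Reactotron UI
    .useReactNative()                       // enable the built-in React Native plugins
    .connect();                             // connect to the Reactotron desktop app
}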

Accessibility Inspector

Xcode offers Accessibility Inspector as a stand-alone tool for, as the name says, inspecting accessibility. It gives insight into several properties of the elements in the user interface.

The biggest caveat with React Native is that Accessibility Inspector is often only capable of viewing the parent elements or higher-order component wrappers instead of a specific element. Therefore it may be unusable on some projects.

Here Accessibility Inspector is only capable of accessing the parent container
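One factor that can contribute to this, besides wrapper components, is how React Native groups elements for assistive technology: a container marked as accessible is exposed as a single element and its children are merged into it. A rough sketch with made-up content:

import React from 'react';
import { Text, View } from 'react-native';

// When the container is marked accessible, its children are merged into
// one accessibility element, which is also what inspection tools report.
const ProfileCard = () => (
  <View accessible={true} accessibilityLabel="Profile card, John Doe, premium member">
    <Text>John Doe</Text>
    <Text>Premium member</Text>
  </View>
);

export default ProfileCard;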

VoiceOver

VoiceOver is gesture-based screen reading software that allows the user to hear the label, role or description of an element. It can be activated from Settings > Accessibility (on older iOS versions, it’s located under General).

It’s a bit unfortunate, but VoiceOver can’t be used with the iOS Simulator. In order to actually understand how different elements are treated by screen reading software, you will need to verify the outcome on a real iOS device.
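What you can do in code, even without a device at hand, is check whether a screen reader is currently running and react to it. A small sketch using React Native’s AccessibilityInfo API (the event name and the subscription’s remove method assume a reasonably recent React Native version):

import { AccessibilityInfo } from 'react-native';

// Check once whether VoiceOver (or TalkBack on Android) is enabled.
AccessibilityInfo.isScreenReaderEnabled().then(enabled => {
  console.log('Screen reader enabled:', enabled);
});

// React to the user toggling the screen reader while the app is running.
const subscription = AccessibilityInfo.addEventListener(
  'screenReaderChanged',
  enabled => console.log('Screen reader toggled:', enabled),
);

// Later, for example when a component unmounts:
subscription.remove();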

Accessibility Scanner

Accessibility Scanner is a tool that checks the state of accessibility and gives you suggestions for improvements. It needs to be downloaded onto the test device from Google Play and then enabled from Settings > Accessibility.

Basically, it scans the contents of the viewport and then provides you with a list of accessibility issues and improvement suggestions. Besides applications, it also allows you to scan the contents of websites in the mobile browser.

The provided suggestions contain very good arguments about what to improve and why. Usually the fixes for better accessibility are rather easy to implement, so there’s no room for poor excuses to delay improvements when accessibility is taken seriously.

Accessibility Scanner found one suggestion for Google’s homepage
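Two findings the Scanner reports very often are missing content labels and touch targets smaller than 48x48dp. Both are usually quick to fix; here is a rough sketch where the icon, handler and sizes are placeholders:

import React from 'react';
import { Image, TouchableOpacity } from 'react-native';

// A hypothetical icon-only button with a content label, a minimum size of
// 48x48dp and some extra hitSlop around it, addressing two typical findings.
const CloseButton = ({ onClose }: { onClose: () => void }) => (
  <TouchableOpacity
    accessibilityLabel="Close dialog"
    hitSlop={{ top: 12, bottom: 12, left: 12, right: 12 }}
    style={{ minWidth: 48, minHeight: 48, justifyContent: 'center', alignItems: 'center' }}
    onPress={onClose}
  >
    <Image source={require('./close-icon.png')} />
  </TouchableOpacity>
);

export default CloseButton;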

TalkBack

TalkBack is Google’s screen reader included on most Android devices. It’s very similar to Apple’s VoiceOver, giving you spoken feedback from the elements you interact with. It can be activated from Settings > Accessibility > TalkBack. Please note that you may need the Android Accessibility Suite from Google Play.

In practice, the first tap selects an element and TalkBack reads it aloud, while a double tap performs the actual action bound to the element. To be honest, TalkBack can easily get very annoying, and the user experience is sometimes rather slow on the emulator. In addition, at least on the emulator, some actions can be unreachable, which leads to switching the TalkBack feature on and off.

TalkBack highlights the element and reads out its content and its role or purpose
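One small thing worth trying while testing with TalkBack is announcing dynamic changes yourself through React Native’s AccessibilityInfo API; a tiny sketch with an example message:

import { AccessibilityInfo } from 'react-native';

// Ask TalkBack (or VoiceOver on iOS) to read a status update aloud,
// for example after a form has been saved.
AccessibilityInfo.announceForAccessibility('Form saved successfully');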

Conclusion

In this part we covered the most common tools that help developers work with accessibility. I’d personally say these tools are mature enough to be included in daily work, and they serve their purpose well.

I know I’m repeating myself, but there are no excuses not to implement accessibility features in React Native applications. There are tools, and there are rather easy code-level techniques to apply to your elements. I’ll cover the code implementation in the next article.