React Native is evolving well in all areas, and recently we’ve seen several enhancements in accessibility. That prompted me to write up the current state of accessibility in React Native applications.
In two separate articles, I will cover tools and techniques that help you, as a developer, improve accessibility in your application. This first article covers setting up the developer tools; the second will focus on techniques and code-level implementation.
What is it all about?
As mobile developers, we know that the available viewport is not as large as in desktop and web applications. On top of that, there are many restrictions and requirements to meet in order to create an intuitive user interface and a fluent user experience. This is why accessibility deserves the same attention as UI and UX design.
Accessibility — also known as a11y — measures how well a user can reach their goals with an interface even when they have disabilities or impairments. For instance, all the buttons in an elevator should be low enough for a child or a wheelchair user to reach.
Since version 0.59, React Native has offered several properties to help with accessibility issues. I’ll cover them in detail in the next article. But first, we need proper tools in order to track down what is going on.
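As a small preview of those properties, here is a sketch of a hypothetical helper that assembles React Native’s accessibility props into a plain object. The helper name `a11yProps` is my own invention, but the prop names themselves (`accessible`, `accessibilityLabel`, `accessibilityRole`, `accessibilityHint`) are the real props React Native provides:

```typescript
// Sketch only: the a11yProps helper is hypothetical, but the prop names
// below are real React Native accessibility props (0.59+).
type A11yRole = 'button' | 'header' | 'link' | 'image' | 'text';

interface A11yProps {
  accessible: true;
  accessibilityLabel: string;
  accessibilityRole?: A11yRole;
  accessibilityHint?: string;
}

function a11yProps(label: string, role?: A11yRole, hint?: string): A11yProps {
  return {
    accessible: true,          // treat this view as one accessible element
    accessibilityLabel: label, // what the screen reader announces
    ...(role ? { accessibilityRole: role } : {}),
    ...(hint ? { accessibilityHint: hint } : {}),
  };
}

// In a component you would spread the result onto an element, e.g.
// <TouchableOpacity {...a11yProps('Save', 'button', 'Saves the form')} />
const props = a11yProps('Save', 'button', 'Saves the form');
console.log(props.accessibilityLabel); // "Save"
```

The tools below are what let you verify that props like these actually reach the screen reader.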
Here are the tools I recommend using:
- React Native Debugger > general debugger and inspector
- Reactotron > for handling app state and network requests
- iOS Simulator: Accessibility Inspector > tool for auditing accessibility
- iOS Device: VoiceOver > screen reader
- Android: Accessibility Scanner > tool for auditing accessibility
- Android: TalkBack > screen reader
How to Use These Tools
Most of these tools have good instructions and tutorials, so feel free to dive deep into them. In this article, I’ll only scratch the surface.
React Native Debugger
If you’re familiar with any debugging tool (such as a browser’s developer tools), RN Debugger is very straightforward to use. From an accessibility perspective, when an element is inspected, the debugger shows the structure of the application and the properties of the element.
Reactotron
Reactotron is a general-purpose standalone tool, not specific to accessibility work. However, it’s an excellent way to track application state and network requests. For instance, if you offer accessibility settings in your app, it’s fairly easy to track them by subscribing to app state actions and changes.
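To make that concrete, here is a minimal sketch of what such an accessibility-settings slice of app state might look like. All names here (`A11ySettings`, the action types, the field names) are hypothetical illustrations, not part of Reactotron or React Native; the point is that every dispatched action below would show up in Reactotron’s timeline along with the resulting state:

```typescript
// Hypothetical accessibility-settings slice. Reactotron would display each
// dispatched action and the state it produced; the names are illustrative.
interface A11ySettings {
  reduceMotion: boolean;
  fontScale: number;
}

type A11yAction =
  | { type: 'a11y/toggleReduceMotion' }
  | { type: 'a11y/setFontScale'; scale: number };

const initialSettings: A11ySettings = { reduceMotion: false, fontScale: 1 };

function a11yReducer(
  state: A11ySettings = initialSettings,
  action: A11yAction,
): A11ySettings {
  switch (action.type) {
    case 'a11y/toggleReduceMotion':
      return { ...state, reduceMotion: !state.reduceMotion };
    case 'a11y/setFontScale':
      return { ...state, fontScale: action.scale };
    default:
      return state;
  }
}

// Each dispatch like this is visible in Reactotron's action timeline:
const next = a11yReducer(initialSettings, { type: 'a11y/setFontScale', scale: 1.5 });
console.log(next.fontScale); // 1.5
```

Keeping accessibility preferences in a single, observable slice like this makes them easy to audit while testing with a screen reader.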
Accessibility Inspector
Xcode offers Accessibility Inspector as a standalone tool for, as the name says, inspecting accessibility. It gives insight into several properties of the elements in the user interface.
The biggest caveat with React Native, however, is that Accessibility Inspector is often only able to inspect parent elements or higher-order component wrappers rather than a specific element. As a result, it may be of limited use in some projects.
VoiceOver
VoiceOver is a gesture-based screen reader that lets the user hear the label, role, and description of an element. It can be activated from Settings > Accessibility (on older iOS versions, it’s located under General).
Unfortunately, VoiceOver can’t be used with the iOS Simulator. To actually understand how different elements are treated by a screen reader, you will need to verify the outcome on a real iOS device.
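You can also check programmatically whether a screen reader is currently running, via React Native’s `AccessibilityInfo.isScreenReaderEnabled()`; on the Simulator (where VoiceOver is unavailable) it simply reports `false`. The sketch below stubs the module so it runs standalone — in a real app you would import `AccessibilityInfo` from `react-native` instead:

```typescript
// In a real app: import { AccessibilityInfo } from 'react-native';
// Stubbed here so the sketch is self-contained; the real method has the
// same shape: isScreenReaderEnabled(): Promise<boolean>.
const AccessibilityInfo = {
  isScreenReaderEnabled: (): Promise<boolean> => Promise.resolve(false),
};

async function describeScreenReader(): Promise<string> {
  const enabled = await AccessibilityInfo.isScreenReaderEnabled();
  // On a real device this reflects VoiceOver/TalkBack being on or off;
  // on the iOS Simulator it will always be false.
  return enabled ? 'Screen reader is running' : 'No screen reader detected';
}

describeScreenReader().then(console.log);
```

This kind of check is useful for logging or for adapting behavior (e.g., disabling autoplay) when a screen reader is active, but it never replaces testing on a real device.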
Accessibility Scanner
Accessibility Scanner is a tool that audits the accessibility of your app and gives you suggestions for improvement. It needs to be downloaded onto the test device from Google Play and then enabled under Settings > Accessibility.
In short, it scans the contents of the viewport and then provides a list of accessibility issues and improvement suggestions. Besides applications, it can also scan the contents of websites in the mobile browser.
The suggestions come with solid arguments about what to improve and why. The fixes for better accessibility are usually easy to implement, so when accessibility is taken seriously, there is little excuse for postponing them.
TalkBack
TalkBack is Google’s screen reader, included on most Android devices. It’s very similar to Apple’s VoiceOver, giving you spoken feedback about the elements you interact with. It can be activated from Settings > Accessibility > TalkBack. Note that you may need the Android Accessibility Suite from Google Play.
With TalkBack enabled, the first tap focuses an element and reads it aloud, and a double tap performs the action bound to it. To be honest, TalkBack can quickly become annoying, and the experience is sometimes rather slow on the emulator. In addition, at least on an emulator, some actions can become unreachable, which forces you to toggle TalkBack on and off while testing.
In this part, we covered the most common tools that help developers work on accessibility. I’d say these tools are mature enough to be part of daily work, and they serve their purpose well.
I know I’m repeating myself, but: there is no excuse not to implement accessibility features in React Native applications. The tools exist, and there are fairly easy code-level techniques to apply to your elements. I’ll cover the code implementation in the next article.
Article featured image by: https://unsplash.com/@joshcala.