
Accessibility in You.i Engine React Native

You.i Engine React Native allows app developers to support screen reader (text-to-speech) functionality using various RN props, methods, and modules. This topic covers the supported props, methods, modules, gesture support, screen reader settings, and best practices for creating accessible apps.

We support screen reader functionality for the following target platforms:

  • Amazon Fire TV (VoiceView)
  • Android Touch (Android TalkBack, Samsung Voice Assistant)
  • Android TV (Android TalkBack)
  • Apple iOS (VoiceOver)
  • Apple tvOS (VoiceOver)
  • Samsung Tizen (Voice Guide)

We provide some guidance for each of these platforms in this topic as well.

Refer to RNSampleApp, located at youiengine/<version>/samples/RNSampleApp, for an example of how accessibility is implemented with You.i Engine React Native. To learn more about accessibility in React Native generally, see Facebook's Accessibility topic.

Before we begin, we should define two terms used in this topic: accessibility focus and navigation focus.

  • Accessibility focus is the focus an element receives through the screen reader, whether by tapping the element or by navigating with screen-reading gestures, regardless of the device the app is running on.
  • Navigation focus on 10ft devices allows users to navigate between different elements or screens in an app.

Props Supported

The following props are supported in You.i Engine React Native for accessibility:

| Props Supported | For Which Components | Details | Facebook RN Docs |
| --- | --- | --- | --- |
| accessible | Button, ButtonRef, Image, Text, TextInput, TextInputRef, TextRef, all Touchable components, TouchableWithoutFeedback, View, and ViewRef | When true, indicates that the view is an accessibility element. | accessible |
| accessibilityRoles | TouchableWithoutFeedback, View, and ViewRef | Communicates the purpose of a component to the user of an assistive technology. You.i Engine React Native supports every role listed for accessibilityRoles. | accessibilityRoles |
| accessibilityStates¹ | TouchableWithoutFeedback, View, and ViewRef | Describes the current state of a component to the user of an assistive technology. You.i Engine React Native supports every state listed for accessibilityStates. | accessibilityStates |
| accessibilityLabel | Button, ButtonRef, Image, ImageRef, Text, TextRef, TouchableWithoutFeedback, View, and ViewRef | A screen reader reads this string when a user selects the associated element. See also How accessibilityLabel Works below. | accessibilityLabel |
| accessibilityHint | Text, TextRef, TouchableWithoutFeedback, View, and ViewRef | Helps users understand what will happen when they perform an action on the accessibility element when that result is not clear from the accessibility label. | accessibilityHint |
| accessibilityActions | View and ViewRef | Allows an assistive technology to programmatically invoke the actions of a component. See also Methods Supported below. | accessibilityActions |

¹ We found that the prop accessibilityState mentioned in the Facebook RN documentation should be accessibilityStates.

Accessibility Prop Example

The following is an example of how to add accessibility props to your application. To see this example in context, see the Customize page of our Quick Start Guide.

 <View
   style={styles.buttonContainer}
   focusable={true}
   accessible={true}
   accessibilityLabel="My button"
   accessibilityHint="Button in your first app"
   accessibilityRole="button"
 >
   <Button
     onPress={this.onMyButtonPress}
     title="My button"
   />
 </View>

 

How accessibilityLabel Works

A screen reader reads the accessibilityLabel string when a user selects the associated element.

  • If an accessibilityLabel is set for an accessible component, the label value is read. Note that if you pass a string whose contents are whitespace to the accessibilityLabel, the screen reader reads out the whitespace.
  • If an accessibilityLabel isn't set for an accessible component, the contents of the component are read.
  • If an accessibilityLabel is set to an empty string for an accessible component, the contents of the component are read. Facebook RN, by contrast, reads nothing in this situation.
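A minimal sketch of the three cases (the labels and contents are illustrative):

```javascript
// Label set: the screen reader reads "Play movie"
<Text accessible={true} accessibilityLabel="Play movie">Play</Text>

// No label: the contents ("Play") are read
<Text accessible={true}>Play</Text>

// Empty-string label: You.i Engine React Native falls back to the
// contents ("Play"); Facebook RN would read nothing in this case
<Text accessible={true} accessibilityLabel="">Play</Text>
```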

Methods Supported

The following methods are supported in You.i Engine React Native for accessibility:

| Method Supported | For Which Component | Notes |
| --- | --- | --- |
| onAccessibilityAction | View and ViewRef | Supports the standard actions activate, escape, magicTap, increment, and decrement. Custom actions are not supported with onAccessibilityAction. |
| onAccessibilityTap | View and ViewRef | See onAccessibilityTap to learn more. |
| onAccessibilityEscape | View and ViewRef | See onAccessibilityEscape to learn more. |
| onMagicTap | View and ViewRef | See onMagicTap to learn more. |
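As a sketch of how accessibilityActions and onAccessibilityAction fit together (following the Facebook RN pattern; the handler names are hypothetical):

```javascript
<View
  accessible={true}
  accessibilityActions={[{ name: 'activate' }, { name: 'escape' }]}
  onAccessibilityAction={(event) => {
    switch (event.nativeEvent.actionName) {
      case 'activate':
        this.onPressed();    // hypothetical handler
        break;
      case 'escape':
        this.closeOverlay(); // hypothetical handler
        break;
    }
  }}
/>
```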

 

Support for the AccessibilityInfo Module

You.i Engine React Native supports the Facebook RN AccessibilityInfo module, which allows you to confirm whether the screen reader functionality on the device is currently active. To learn which methods You.i Engine React Native supports, see AccessibilityInfo.
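For instance, a minimal check (a sketch assuming the newer RN API names; older RN releases expose the same information through AccessibilityInfo.fetch() and the event name 'change'):

```javascript
import { AccessibilityInfo } from 'react-native';

// Ask once whether the screen reader is currently active...
AccessibilityInfo.isScreenReaderEnabled().then((enabled) => {
  console.log('Screen reader enabled:', enabled);
});

// ...and subscribe to later changes.
AccessibilityInfo.addEventListener('screenReaderChanged', (enabled) => {
  // e.g. switch to more verbose hints when enabled
});
```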

Enabling Accessibility in After Effects

Screen Reader

Screen reader functionality is supported for screen elements like buttons, scrolling lists, and static images created within an After Effects layout. Be aware that some elements like buttons and text inputs have default accessibility behaviors, while others don't. If you don’t specify accessibility attributes as described in this section, you may see the default behaviors in some screen elements in your apps, and none in others.

You can specify attributes for this functionality in After Effects by adding accessibility properties to your layer comments for each element before you export. All you have to do is type in the property for the associated attribute in the layer comment field.

For example, the following image shows a Back button with accessibility attributes set within layer comments in the After Effects timeline view.

Accessibility Attributes: Label, States, Role, Hint

These accessibility comments are React Native props that are picked up from After Effects when generating an RN or a C++ app:

  • accessibilityLabel
  • accessibilityHint
  • accessibilityStates
  • accessibilityRole

When a screen element is activated via a gesture from a user, screen reader capability in the device reads the attributes aloud in this order: label, states, role, hint.

Depending on the component you're creating in After Effects, you'll need to input a particular, preset accessibilityRole, such as button for a button. The same applies to accessibilityStates. You.i Engine One supports all of the roles available in React Native; accessibilityStates supported include disabled, checked, and expanded.

For the full list of accessibility roles, see Facebook’s RN documentation on accessibilityRole.

For hints and labels, consider the needs of the user and the design of the final app. Typically, the accessibilityHint tells a user the action performed by the element they’re activating. For example, a back button hint might be “navigates to the previous screen.” The accessibilityLabel performs a similar function, except that the label for a back button might be “go back.”

If a user requests the whole screen to be read one element at a time, both the accessibilityLabel and the accessibilityHint are read. However, if the entire screen is being read, attributes are read in this order: label, states, role (hint is not read). This might affect your decisions about what you want the label and the hint to be.

The tags associated with each designation, such as accessibilityLabel: “Back”, allow the screen reader feature to read that tag for each screen element in a view.
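For example, the layer comments for a Back button might contain entries of this form (the values are illustrative):

```
accessibilityLabel: "Back"
accessibilityStates: "disabled"
accessibilityRole: "button"
accessibilityHint: "Navigates to the previous screen"
```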

For more on how these props work, see Facebook's Accessibility documentation.

C++ Accessibility Attributes: CYIAccessibilityAttributes

Layer comments in After Effects work similarly for both C++ apps and React Native apps. For a C++ application, CYIAccessibilityAttributes is the class you'd interface with directly to enable screen reading functionality, whereas the React Native framework interfaces with that class for you when developing a React Native application. See our API documentation on CYIAccessibilityAttributes for more information.

Limitations

Generally, any screen element that you create in After Effects that has accessibility information listed in the layer comments for that element is read. One exception involves scrolling lists: when creating a scrolling list, the list must use a row or column arrangement if you want the list item's index to be read.

While React Native allows you to create variable strings that refer to other props, in After Effects the layer comments are static strings that can't reference other props. To handle this limitation, rather than inserting your comments in AE, use the Ref component associated with the screen element in AE, and set any applicable RN accessibility props within that Ref component.
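For instance, a sketch of that approach (assuming the Ref component is bound to the AE element by its name prop; "BackButton" and previousScreenTitle are hypothetical):

```javascript
// "BackButton" is the AE element this Ref is bound to, and
// previousScreenTitle is computed at run time -- something a static
// AE layer comment can't express.
<ViewRef
  name="BackButton"
  accessible={true}
  accessibilityLabel={'Go back to ' + previousScreenTitle}
/>
```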

Best Practices for Developing Accessible Apps

Follow these best practices for developing accessible apps using You.i Engine React Native:

General Best Practices

  • Accessibility attributes set for a component in RN app code will override accessibility attributes set for that component in AE.
  • Annotate your app code with appropriate labels or hints using accessibilityHint and accessibilityLabel to prevent the announcement of special characters. For example, in our Quick Start Guide we annotate You.i TV as you dot i TV.
    <Text
      style={[styles.bodyText, themeStyles.bodyText]}
      accessibilityLabel="https://developer dot you i dot tv"
    >
      https://developer.youi.tv
    </Text>
  • Provide as much information to the user as you can. For example, use accessibilityRole to announce the purpose of a component and accessibilityStates to announce a component's current state.
  • Use the AccessibilityInfo module that provides methods, such as setAccessibilityFocus and announceForAccessibility, to customize the accessibility features in your app.
  • You.i Engine React Native apps don't automatically read text embedded in images. Instead, create a text element in a View component and add an accessibilityLabel with any text in the image that you want announced.
  • Use a flag that can be tracked by the app so that hints or labels are read only at app launch.
  • If the screen reader is enabled at app launch, conditionally render a modal overlay to add a welcome message on app launch that includes specific hints.

Best Practices for Lists

  • Add accessibilityHint or accessibilityLabel to the items for a screen reader to announce the number of tiles in an RN list. For example, add the accessibilityLabel for the Popular Movies heading with the total count, or include the index and the total count for each list item. Note that because this is handled automatically for Android Touch apps, you need to add special handling for Android Touch. For example:
    import React from 'react';
    import { FlatList, NativeModules, Text, View } from 'react-native';
    
    const isAndroid = NativeModules.PlatformConstants.platform === 'android';
    
    class PopularMovies extends React.Component {
    
      render() {
        return (
          <>
            <Text accessibilityHint={this.props.assets.length + " items."}>
              Popular Movies
            </Text>
            <FlatList
              data={this.props.assets}
              renderItem={this.renderItem}
            />
          </>
        );
      }
    
      renderItem = ({ item, index }) => {
        // On Android, the item index is announced in addition to the provided hint, so we want to exclude that information from the hint.
        return (
          <View accessibilityHint={!isAndroid ? "Item " + (index + 1) + " of " + (this.props.assets.length) + "." : ""}>
            {/* etc */}
          </View>
        );
      }
    }

Focus Best Practices

If the screen reader is turned on after an app is launched, You.i Engine One automatically moves the accessibility focus to the currently focused item if it's accessible. If there's currently no focus, or if the focused item isn't accessible, You.i Engine One searches for an accessible focus item from the top-left corner, in left-to-right reading order. As a result, it's a good idea for your app to handle the state change when the screen reader is enabled by explicitly setting the accessibility focus to a specific location. You can use addEventListener and setAccessibilityFocus to listen for the screen reader state change and set the accessibility focus; see AccessibilityInfo in the Facebook RN docs.
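A minimal sketch of that pattern (the heading ref is illustrative; note that older RN releases use the event name 'change' instead of 'screenReaderChanged'):

```javascript
import React from 'react';
import { AccessibilityInfo, View, findNodeHandle } from 'react-native';

class HomeScreen extends React.Component {
  componentDidMount() {
    AccessibilityInfo.addEventListener('screenReaderChanged', this.onScreenReaderToggled);
  }

  onScreenReaderToggled = (enabled) => {
    if (enabled && this.heading) {
      // Move the accessibility focus to a known starting point
      // instead of letting the engine search from the top-left.
      AccessibilityInfo.setAccessibilityFocus(findNodeHandle(this.heading));
    }
  };

  render() {
    return <View ref={(ref) => { this.heading = ref; }} accessible={true} />;
  }
}
```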

Focus on 10ft Devices

  • Navigation focus allows users to navigate between different elements or screens in an app. Accessibility focus follows the element that already has navigation focus.
  • If an element doesn't have navigation focus, a user can't navigate to it or interact with it, but the element will still be announced when the screen reader reads the entire screen. As a result, to ensure that text is read aloud, make sure it's part of a focusable view. For example, if a user is focused on an item in the settings page, there may be some information that is not selectable and cannot be navigated to. However, the screen reader will still read the information presented outside of the focus area.
  • You.i Engine One provides additional help for elements without navigation focus. When navigation focus moves into a subtree that contains other accessible elements, the screen reader reads those elements the first time focus enters the subtree, but not again until focus leaves and re-enters it. For example, when you navigate between the lander lists in our RNSampleApp, the header "Popular Movies" is read when you first move focus into its list, but not as you move focus within that list. This behavior, which differs from standard 10ft behavior, ensures that elements are exposed to the screen reader even when they're not focusable.

Note: Different target platforms may have additional focus behavior. See Target Platforms below for more information.

  • Use accessibilityHint on the relevant view for a screen reader to provide navigation hints.
  • Use accessibilityHint to configure hints based on the platform the app is running on.
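As a sketch of platform-dependent hints (the helper and its wording are hypothetical; in an app the value would come from React Native's Platform.OS):

```javascript
// Hypothetical helper: pick a navigation hint that matches the input
// method of the current platform.
function navigationHint(platformOS) {
  switch (platformOS) {
    case 'ios':
    case 'android':
      // Touch platforms: screen readers activate items with a double-tap.
      return 'Double-tap to open.';
    default:
      // 10ft platforms (tvOS, Android TV, Fire TV, Tizen) use a remote.
      return 'Press select to open.';
  }
}

// Usage: <View accessibilityHint={navigationHint(Platform.OS)} />
```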

Search Best Practices

Use accessibilityHint, accessibilityLabel, and accessibilityRoles to configure the screen reader for search results or search text.

Target Platforms

Amazon Fire TV

Before you start, make sure to enable VoiceView on Fire TV.

To learn more about VoiceView on Fire TV, see Assistive Technologies for Fire OS.

Action Button Support for Fire TV

| Type | Command | Action Button |
| --- | --- | --- |
| Navigation | Move to next item | Click on right arrow/circle |
| Navigation | Move to previous item | Click on left arrow/circle |
| Navigation | Back out of the current content | Back button |
| Navigation | Move to related content | Select button |
| Output | Read page starting at the top | Navigate to the top using the arrow keys |
| Interaction | Toggle current action | |
| Interaction | Show item of chooser | |

Fire TV VoiceView Settings Supported

| Setting | Description |
| --- | --- |
| Verbosity | Choose your speech verbosity. Select custom (the default setting) to adjust all speech preferences: keyboard echo, speak usage hints, speak lists and grid information, speak the number of list items on the screen, speak element type, and speak phonetic letters. |
| Speak Passwords | Toggle on/off |
| Speech Volume | |
| Sound Feedback | Toggle on/off |
| Sound Feedback Volume | |
| Audio Ducking | Decrease other audio volume while speaking |

Android Touch

You.i RN apps for the Android Touch platform support screen reader functionality through Android's TalkBack feature. As users navigate using TalkBack gestures, accessibility information is read aloud for the currently focused element. You.i RN apps also support a few additional gestures available with Samsung's Voice Assistant.

Supported Accessibility Actions on Android Touch

See also Navigate Your Device with TalkBack and Use TalkBack Gestures in the Android documentation, as well as Samsung Mobile Accessibility.

| Category | Action | Default Gesture |
| --- | --- | --- |
| Basic Navigation | Move to next item on screen | Swipe right |
| Basic Navigation | Move to previous item on screen | Swipe left |
| Basic Navigation | Cycle through navigation settings | Swipe up or down |
| Basic Navigation | Select focused item | Double-tap |
| Basic Navigation | Move to first item on screen | Up then down |
| Basic Navigation | Move to last item on screen | Down then up |
| Scrolling | Scroll forward (if you're on a page longer than one screen) | Right then left |
| Scrolling | Scroll back (if you're on a page longer than one screen) | Left then right |
| Sliders | Move slider up (such as volume) | Right then left. When a slider has focus, the hardware volume-up key is also supported. |
| Sliders | Move slider down (such as volume) | Left then right. When a slider has focus, the hardware volume-down key is also supported. |
| Output | Read specific element | Single-tap |
| Output | Read from top of screen | TalkBack: from the global context menu, or shake the device. Voice Assistant: three-finger tap, or shake the device. |
| Output | Read from next item | TalkBack: from the global context menu |
| Other | Home | TalkBack: up then left. Voice Assistant: four-finger double-tap. |
| Other | Back | TalkBack: down then left. Voice Assistant: four-finger tap. |
| Other | Overview | Left then up |
| Other | Notifications | Right then down (or two-finger swipe down from top of screen) |
| Other | Screen search | Left then down |
| Other | Open local context menu | Up then right |
| Other | Open global context menu | Down then right |
| Direct app interaction | Navigate in app as though Explore by Touch were disabled | Two-finger drag |
| Direct app interaction | Select element in app as though Explore by Touch were disabled | Two-finger tap |

Supported Accessibility Settings on Android Touch

You.i RN apps for Android Touch support all TalkBack accessibility settings.

Android Touch Best Practices

For platforms other than Android Touch, we recommend adding accessibilityHint or accessibilityLabel to list items for a screen reader to announce the number of tiles in an RN list. If you do this for Android Touch, however, TalkBack reads the item's position twice. See Best Practices for Lists for an example of how to handle this.

Android TV

Before you start, make sure to enable TalkBack on Android TV.

To learn more about TalkBack on Android TV, see Get Started on Android with TalkBack.

Action Button Support for Android TV

| Type | Command | Action Button |
| --- | --- | --- |
| Navigation | Move to next item | Click on right arrow/circle |
| Navigation | Move to previous item | Click on left arrow/circle |
| Navigation | Back out of the current content | Back button |
| Navigation | Move to related content | Select button |
| Output | Read page starting at the top | Navigate to the top using the arrow keys |
| Interaction | Toggle current action | |
| Interaction | Show item of chooser | |

Android TV TalkBack Settings Supported

| Setting | Description |
| --- | --- |
| Verbosity | Choose your TalkBack speech verbosity. Select custom (the default setting) to adjust all speech preferences: keyboard echo, speak usage hints, speak lists and grid information, speak the number of list items on the screen, speak element type, and speak phonetic letters. |
| Speak Passwords | Toggle on/off |
| Speech Volume | |
| Sound Feedback | Toggle on/off |
| Sound Feedback Volume | |
| Audio Ducking | Decrease other audio volume while speaking |

Apple iOS

Before you start, make sure to enable VoiceOver on iOS.

Gesture Support for iOS

| Type | Command | Gesture Supported |
| --- | --- | --- |
| Interaction | Activate | One-finger double-tap |
| Interaction | Escape | Two-finger scrub (Z motion) |
| Interaction | Custom Action | Two-finger double-tap (Magic Tap) |
| Basic Navigation | Move to Next Item | One-finger swipe-right |
| Basic Navigation | Move to Previous Item | One-finger swipe-left |
| Other | Select an element under your finger | One-finger single-tap, or one-finger drag across the screen |
| Scrolling | Scroll Down | Three-finger swipe-up |
| Scrolling | Scroll Left | Three-finger swipe-right |
| Scrolling | Scroll Right | Three-finger swipe-left |
| Scrolling | Scroll Up | Three-finger swipe-down |
| Output | Read from top to bottom | Two-finger swipe-up |
| Output | Read page starting at selected item | Two-finger swipe-down |

iOS VoiceOver Settings Supported

| VoiceOver Setting Supported | Notes |
| --- | --- |
| Speaking Rate | Changes with the rotor setting. |
| Use Pitch | |
| Verbosity › Emoji Suffix | |
| Speech › Voice | |
| Audio › Pronunciations | |
| Add new languages (rotor languages) | Changes with the rotor setting. |
| Audio Ducking | |
| Caption Panel | Only supported on iOS 13. |

iOS Best Practices

For an iOS app, move the accessibility focus using setAccessibilityFocus after transitioning to a new screen.

Video Player Best Practices

Annotate your app code with appropriate labels or hints using accessibilityHint or accessibilityLabel. The following table recommends the video player actions to associate with each gesture:

| Gesture | Recommended Action for Video Player Controls |
| --- | --- |
| Swipe Up | Fast forward by 15 seconds (iOS only) |
| Swipe Down | Rewind video by 15 seconds (iOS only) |
| Tap on Scrubber | Select current position. Use one-finger swipe-up or swipe-down to adjust the value. |
| Tap on Any Button | Select Play/Pause/Fast Forward 15 seconds/Rewind Video 15 seconds |
| Tap on Video | Select video |
| Double-Tap on Video | Toggle video player controls |

iOS-specific Known Issues

We're aware of the following known issues in You.i Engine One apps when VoiceOver is enabled on an iOS device.

Keyboard

  • Tapping away from the keyboard once dismisses the keyboard.
  • It's not possible to begin a drag on the keyboard and continue that drag elsewhere on the screen.
  • When the iOS keyboard is active, two accessibility highlights are visible: one on the selected key, and one in the application.

Text Announcements

  • When the accessibility focus is moved to a non-accessible element in the app (for example, when the iOS keyboard is dismissed, or when a user taps on an iOS view that has no accessibility metadata), the app may begin to announce "Possible Text", followed by any information in the view that the iOS platform has identified as readable.
  • Some contextual information that's announced by native iOS apps, such as "is editing", is not announced for You.i Engine One iOS apps.
  • VoiceOver announcements such as "Items N to M of O" after a three-finger swipe aren't supported.

Gestures

  • Certain gestures, including those that move the cursor to the beginning or the end of the screen, aren't supported for You.i Engine One iOS apps. See the list of supported gestures.

Lists

  • When the two-finger swipe-up gesture is used to read the whole screen, the You.i Engine React Native behavior differs from native iOS behavior: screen reading starts at the first item in a fully visible list. If a list is only partially visible at the top of the screen, its contents aren't read.

Other

  • Two accessibility highlights are visible when the accessibility focus moves to another iOS view.
  • When the accessibility focus moves away from a text field, the text field becomes deactivated.
  • When working with Touchables (e.g., TouchableOpacity), if both onPress and onAccessibilityTap are defined as props, both are invoked when VoiceOver is enabled on iOS.

Apple tvOS

Before you start, make sure to enable VoiceOver on tvOS.

To learn more about VoiceOver on tvOS, see Use VoiceOver on Apple TV.

 

Gesture Support for tvOS

Note

The VoiceOver rotor is a touch control that lets you choose options by rotating two fingers on the Touch surface of the Siri Remote. See Apple's VoiceOver Guide for more information.

  • Only two Rotor mechanisms are supported for tvOS: DirectTouch and Follow Focus. For more details, see VoiceOver rotors.
  • Depending on what rotor mechanism the user has selected, you’ll get different results. With Follow Focus, when you swipe to move focus, the focus moves by one item, but with DirectTouch, swiping moves focus across multiple items. For more details, see VoiceOver on your Apple TV.

| Type | Command | Gesture | Rotor Mechanism |
| --- | --- | --- | --- |
| Navigation | Move to Next Item | One-finger swipe-right | Follow Focus or Direct Touch |
| Navigation | Move to Previous Item | One-finger swipe-left | Follow Focus or Direct Touch |
| Navigation | Move to Next Item Using Rotor Settings | One-finger swipe-down | Follow Focus or Direct Touch |
| Navigation | Move to Previous Item Using Rotor Settings | One-finger swipe-up | Follow Focus or Direct Touch |
| Interaction | Select Previous Rotor Setting | Two-finger rotate-counterclockwise | Follow Focus or Direct Touch |
| Interaction | Select Next Rotor Setting | Two-finger rotate-clockwise | Follow Focus or Direct Touch |
 

Note

You.i Engine React Native does not currently support the VoiceOver Speak Hints feature.

tvOS VoiceOver Settings Supported

| VoiceOver Setting Supported | Details |
| --- | --- |
| Verbosity | Media description |
| Voice | |
| Pronunciations | Phrase (type out how the phrase should be pronounced), Language, Ignore case (on/off) |
| Speech Rate | |
| Use Pitch | |
| Audio Channels | |

tvOS Best Practices

Annotate your app code with appropriate labels or hints using accessibilityHint or accessibilityLabel. The following table recommends the video player actions to associate with each gesture:

| Gesture | Recommended Action for Video Player Controls |
| --- | --- |
| Swipe Up | Navigate between video player UI controls |
| Swipe Down | Navigate between video player UI controls |
| Tap on Scrubber | Select current position. Use one-finger swipe-up or swipe-down to adjust the value. |
| Tap on Any Button | Select Play/Pause/Fast Forward 15 seconds/Rewind Video 15 seconds |
| Tap on Video | Select video |
| Double-Tap on Video | Toggle video player controls |

tvOS-specific Known Issues

  • The gesture for pausing and resuming speaking on tvOS (two-finger single tap) isn't currently supported.
  • Reading the entire screen is not supported.
  • When testing accessibility functions with the Siri Remote, be aware of an issue that occurs when using a real remote with tvOS Simulator: no events are fired for SiriRemoteClickUp, SiriRemoteClickDown, SiriRemoteClickLeft, or SiriRemoteClickRight. This issue does not occur when using the simulated remote. See also Working with the Input Module.

Samsung Tizen

On Tizen, the screen reader is automatically enabled and disabled when the Voice Guide is turned on and off in the Tizen TV settings menu. The error overlay that’s displayed when a Tizen app crashes during launch or at runtime will also automatically vocalize the screen contents when the Voice Guide is enabled. You.i RN apps respect Tizen platform settings that control the visual representation of subtitles; we integrate with Tizen’s TVInfo Digital Caption Options APIs to do so.

On 10-foot platforms like Tizen, You.i RN accessibility relies on our Navigation Focus system. Once the Voice Guide is active, users simply focus on screen elements by navigating using remote control input, and the accessibility information for the currently focused element is automatically read aloud. A You.i RN app provides the same accessibility user experience as an HTML5 Tizen web app.

Supported Tizen Accessibility Actions

You.i React Native supports starting and stopping the screen reader based on the element that currently has focus.

The mapping of accessibility actions to remote control input depends on the remote control model; consult Samsung’s documentation for details.

The following actions aren't supported:

  • pausing and resuming the screen reader
  • switching the screen reader voice
  • reading the entire screen (because it's unsupported by Tizen)

Supported Tizen Accessibility Settings

When Voice Guide is active, You.i RN apps for Tizen automatically respect the TV’s current accessibility settings.

| Accessibility Setting | Supported? |
| --- | --- |
| Voice Guide: Volume | Yes |
| Voice Guide: Speed | Yes |
| Voice Guide: Pitch | Yes |
| Video Description | Yes |
| High Contrast | No |
| Enlarge | No |
| Captions | Yes |
 

Support for Tizen Digital Caption Options

You.i RN’s custom subtitle rendering system integrates with Tizen TV’s digital caption options. When a user sets a Digital Caption Option on the TV, the change is reflected in your app. We support all of the Tizen TV Digital Caption Options except the following, due to issues with Tizen’s APIs:

  • Font Style (for example, Bold and Italic)
  • Foreground Opacity: Slightly Translucent
  • Foreground Opacity: Highly Translucent
  • Background Opacity: Slightly Translucent
  • Background Opacity: Highly Translucent
  • Window Opacity: Slightly Translucent
  • Window Opacity: Highly Translucent

In addition, completely transparent text (foreground opacity) is not supported due to limitations within HTML text rendering.

See also Supported Streaming Protocols, Timed Metadata Formats, DRM, Closed Captions, and Subtitles Features.

General Limitations and Known Issues

The following are the current limitations and known issues for using accessibility with You.i Engine React Native:

  • Facebook React Native doesn't support announcing the row or column details for the currently focused element, so You.i Engine React Native doesn't support this for RN lists. But you can add app-side logic to announce this information to users.
  • If you're using the speech synthesizer module to speak text and the screen reader is enabled on your device, you may hear an overlap of voices.
  • Appium tests don't currently work when the screen reader is enabled. Support for accessibility testing with Appium is being evaluated.
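For example, the row and column details mentioned in the first limitation above can be announced with app-side logic (the helper and its string format are hypothetical; in an app you'd pass the result to AccessibilityInfo.announceForAccessibility):

```javascript
// Hypothetical helper: build a position announcement for the focused tile.
function positionAnnouncement(rowIndex, columnIndex, columnCount) {
  return 'Row ' + (rowIndex + 1) + ', item ' + (columnIndex + 1) +
         ' of ' + columnCount + '.';
}

// Usage: AccessibilityInfo.announceForAccessibility(positionAnnouncement(0, 2, 10));
```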