Developer Portal

You.i Engine One: Frequently Asked Questions

Technical FAQs

Since announcing You.i Engine One’s compatibility with React Native, the response has been very positive. Version 5.0 of You.i Engine One is the first release to officially include React Native as a fully integrated part of the product.

While this presents a great new way of developing with You.i Engine One, it also raises a lot of questions. Below, we've gathered some of the most common inquiries we receive, and attempt to provide some clarity on the inner workings of You.i Engine One, including how it connects to this new abstraction.

How does You.i Engine One connect to native platform widgets on a per-device basis?

You.i Engine One does not connect to native widgets. The widgets you see on screen are rendered in a full-screen graphics context, such as OpenGL ES. The intent is that when a user creates a widget such as a button, that button appears and performs consistently across all of their target devices. This also places the emphasis on the brand’s identity, rather than the operating system’s native look and feel.

In the video app space, this isn’t uncommon. Audiences for these apps expect availability across a wide range of platforms, many of which lack a strong native design language to begin with, and the emphasis on video playback over UI makes this approach more acceptable to end users.

On a screen-by-screen basis, it may make sense to write a screen natively. An example would be an Apple TV search screen: the user interacts with a fully native search screen and keyboard, selects a result, and passes that result back to the You.i Engine One layer to continue the user’s journey. While this can be desirable, bear in mind that it fragments the codebase: the search screen must now be written natively for each platform, unless a You.i Engine One fallback is written to handle the remaining platforms.

How does You.i Engine One differentiate between remotes and touch-enabled inputs?

The widgets used on a remote-input device and a touch-input device are identical; it is only the means by which they receive events that changes. On a touch device, user taps are raycast into the scene tree and collision detection determines the intersecting widget. On a remote device, one element is always in focus, and the focus engine maps the cardinal directions to focus changes and the “enter button” to the interaction event.

This means that, in general, a developer is more concerned with connecting to and handling the interaction event, whether it came from touch, an “enter button”, a mouse or pointer click, or even the Enter key on a desktop keyboard. Of course, they can further refine focus or touch behavior as necessary if the default behavior doesn’t align with expectations.
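
As a sketch of this event model, consider a tiny focus manager that maps directional keys to focus moves and “enter” to the same interaction handler a tap would fire. All names below are hypothetical and invented for illustration; this is not You.i Engine One’s actual API.

```javascript
// Hypothetical focus engine sketch: directional keys move focus between
// widgets, and "enter" fires the focused widget's interaction handler,
// the same handler a touch tap or mouse click would invoke.
class FocusManager {
  constructor(widgets) {
    this.widgets = widgets; // ordered left-to-right for simplicity
    this.focusedIndex = 0;
  }

  handleKey(key) {
    if (key === "right") {
      this.focusedIndex = Math.min(this.focusedIndex + 1, this.widgets.length - 1);
    } else if (key === "left") {
      this.focusedIndex = Math.max(this.focusedIndex - 1, 0);
    } else if (key === "enter") {
      this.widgets[this.focusedIndex].onInteraction();
    }
    return this.widgets[this.focusedIndex].name;
  }
}

const pressed = [];
const widgets = ["Play", "Pause", "Stop"].map((name) => ({
  name,
  onInteraction: () => pressed.push(name),
}));

const focus = new FocusManager(widgets);
focus.handleKey("right"); // focus moves to "Pause"
focus.handleKey("enter"); // fires Pause's interaction handler
console.log(pressed);     // → [ 'Pause' ]
```

The point of the pattern is that application code only subscribes to `onInteraction`; whether the trigger was a D-pad press, a tap, or a click is the focus/input layer’s concern.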

Do platform gatekeepers reject apps built with You.i Engine One? Can I get featured on the platform’s app store?

You.i Engine One apps are fully compliant with the requirements of the platforms the engine supports, and they are routinely featured in those app stores.

How does You.i Engine One integrate native modules?

You.i Engine One exposes a native module system similar to that of React Native. The difference is that native code is first bridged to C++ before being lifted to JavaScript with C++ macros, rather than platform-specific syntax.
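
The JavaScript-facing half of such a bridge can be pictured with a small sketch. Everything below is invented for illustration: in the real system, registration would happen on the C++ side via the engine’s macros, while here a plain object stands in for the native module so the calling pattern is visible.

```javascript
// Hypothetical sketch of the JavaScript side of a native module bridge.
// A registry maps module names to method tables; JS consumers look up a
// module by name and call its methods.
const nativeRegistry = {};

function registerNativeModule(name, methods) {
  nativeRegistry[name] = methods; // the engine would do this from C++
}

function getNativeModule(name) {
  const mod = nativeRegistry[name];
  if (!mod) throw new Error(`Native module not registered: ${name}`);
  return mod;
}

// "Native" side registration (would live in C++ in the real system).
registerNativeModule("DeviceInfo", {
  getPlatform: () => "tvOS",
});

// JavaScript consumer code.
const DeviceInfo = getNativeModule("DeviceInfo");
console.log(DeviceInfo.getPlatform()); // → tvOS
```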

How do I connect my third-party libraries?

The process for connecting a third-party library involves writing a native module to wrap the library’s API and linking the library into the build system through CMake. If the library is used on multiple platforms, the API can be aggregated into a common interface at the C++ level before being lifted to JavaScript.

How does You.i Engine One support React Native on Roku?

You.i Engine One supports Roku through our Cloud solution, which is best described by the analogy of a web server and browser. The app and engine are compiled as a Linux executable that runs on a cloud virtual machine. When this machine goes to render, it instead serializes the description of the scene into Roku’s SceneGraph language, much as a web server produces HTML. The Roku client then receives this description and reproduces it using a combination of SceneGraph and BrightScript, much as a web browser renders a web page. Inputs from the end user are forwarded to the server to keep the state in sync, but perceived latency is minimal because all actions are first processed locally and natively on the client.

The connection to this server is established when the app launches. It connects to an API gateway which routes the connection to a machine that isn’t yet at capacity.
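
The serialization step can be illustrated with a toy example. The node shapes and tag names below are invented for the example; they only show the “render the scene as markup” idea, not the actual SceneGraph output.

```javascript
// Illustrative only: serialize a tiny scene tree into XML-ish markup,
// the way a cloud server might describe a scene to a thin client.
function serializeNode(node, indent = "") {
  const attrs = Object.entries(node.props || {})
    .map(([k, v]) => ` ${k}="${v}"`)
    .join("");
  const children = (node.children || [])
    .map((c) => serializeNode(c, indent + "  "))
    .join("\n");
  return children
    ? `${indent}<${node.type}${attrs}>\n${children}\n${indent}</${node.type}>`
    : `${indent}<${node.type}${attrs} />`;
}

const scene = {
  type: "Group",
  children: [
    { type: "Label", props: { text: "Hello Roku" } },
    { type: "Poster", props: { uri: "art.png" } },
  ],
};

console.log(serializeNode(scene));
```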

Which JavaScript engine do you use?

Similar to Facebook’s React Native, we default to JavaScriptCore as the JavaScript engine on most platforms. When debugging, execution switches to V8 running in Chrome, also as in Facebook’s React Native. On Microsoft devices, we use the Chakra engine.

How much of my code will I be able to share between platforms?

You.i Engine One can achieve as much as 100% code reuse for smaller apps and samples. Realistically, projects average about 85%. The figure scales with how many third-party dependencies require platform-specific native modules, and with whether there is any intentional fragmentation per platform.

Can I view and modify You.i Engine One’s source code?

At this time, You.i Engine One is proprietary and closed-source. However, the C++ headers, the Ruby build scripts, the JavaScript half of the React Native bindings, and the BrightScript written for the Roku client are all readable.

How does You.i Engine One’s After Effects workflow integrate into React Native?

The content from After Effects is serialized to files using Google’s Protocol Buffers. These files can ship with the app package or be hosted and served at runtime. When imported, they are deserialized into scene tree nodes, with metadata informing the types of the created nodes.

To integrate this process with React Native, we extended the JSX syntax to include a Composition tag. This tag declares the After Effects composition that should be imported. From there, additional JSX tags—collectively referred to as “Ref” components—allow developers to refer into the composition’s elements and bind to them. As an example, the ButtonRef tag creates a reference to a named layer inside the composition and can bind onClick events. In general, these Ref components attempt to mirror the API and functionality of their respective React Native components. To see this in action, watch the relevant videos on our demo page.

How does You.i Engine One integrate with React Native's shadow tree?

You.i Engine One connects to Facebook’s React Common layer in the React Native codebase. This layer is abstracted from any particular platform and contains C++ code that can connect to the C++ of the engine. When render events come through this layer, the engine uses React Native’s shadow tree to inform how to set up its internal scene tree so that the final result appears as the developer intended. This includes using the Yoga layout library to position elements on screen using Flexbox-like syntax.
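
The layout step can be pictured with a toy version of a flex-row pass. Real Yoga handles far more (grow/shrink factors, wrapping, margins, nested trees); this sketch only shows the idea of a layout pass turning style into positions before the scene tree is rendered.

```javascript
// Toy Flexbox-style layout pass: place fixed-width children in a row
// by accumulating x offsets, roughly what a layout engine does before
// handing positions to the renderer.
function layoutRow(children) {
  let x = 0;
  return children.map((child) => {
    const placed = { ...child, x };
    x += child.width;
    return placed;
  });
}

const placed = layoutRow([{ width: 100 }, { width: 50 }, { width: 25 }]);
console.log(placed.map((c) => c.x)); // → [ 0, 100, 150 ]
```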

How does You.i Engine One connect to the platform's rendering system?

When the app is launched, the engine creates a full-screen rendering context using the APIs exposed by the underlying platform: OpenGL for desktop, OpenGL ES for most other platforms, DirectX for anything Microsoft, GNM on PlayStation 4, and SceneGraph on Roku.

Nodes in the scene tree contain a mesh and texture which are passed through to the appropriate rendering pipeline. Fonts are rendered using either a rasterized bitmap atlas or a signed-distance field atlas.

For most of these, we have access to the shading pipeline and use shaders to transform vertices, map textures, and produce any effects such as blur. If a developer was so inclined, they could also write custom shaders to produce advanced graphical effects at this layer.

What is the memory footprint of apps built with You.i Engine One?

The footprint depends on how much memory the app allocates, much of which is occupied by textures. On average, apps built with You.i Engine One use about 200 MB. On some devices, such as streaming sticks or low-end Rokus, an app can squeeze in under 30 MB.
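
Texture cost is easy to estimate with standard graphics arithmetic (this is general math, not an engine-specific API): an uncompressed RGBA texture costs width × height × 4 bytes, so large full-screen images add up quickly.

```javascript
// Back-of-envelope texture memory math: an uncompressed RGBA texture
// costs width * height * 4 bytes.
function textureBytes(width, height, bytesPerPixel = 4) {
  return width * height * bytesPerPixel;
}

const mb = (bytes) => bytes / (1024 * 1024);

// A single full-HD (1920x1080) background image:
console.log(mb(textureBytes(1920, 1080)).toFixed(1)); // → 7.9
```

At roughly 8 MB per full-HD texture, a couple dozen uncompressed backgrounds would already consume most of a 200 MB budget, which is why reusing and downscaling textures matters.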

How does video playback work on You.i Engine One?

You.i Engine One does not contain a custom playback engine but rather connects to the “obvious” player of each platform. Examples include AVPlayer for Apple, ExoPlayer for Android, and UWP MediaPlayer for UWP. The frames from these playback engines are decoded and presented to a video surface within the rendering layer of You.i Engine One. At the React Native layer, a developer creates an instance of a player using the Video component, which is not dissimilar to the open source react-native-video. Playback controls are created using regular UI controls overlaid on the video surface. Additionally, the native playback controls can be used if desired, similar to the above example with a native search screen.

Can I bring my own player?

Many apps launched with You.i Engine One have integrated a third-party video player. The process for doing this is very similar to the above question on connecting third-party libraries. Optionally, while performing this integration, a developer may want to integrate the video surface as a texture. This allows free transformation and animation of the video surface.

Which streaming formats does You.i Engine One support?

MP4, HLS, DASH, and Smooth Streaming are all supported to varying degrees, depending on the underlying player. In general, HLS tends to have the broadest support.

Which DRM schemes does You.i Engine One support?

You.i Engine One pre-integrates FairPlay for Apple and Widevine and PlayReady for everything else. If a project requires any DRM scheme beyond this, it can be integrated as a native module.

If you’d like to dig in deeper and learn more about You.i Engine One, please check out our Product Overview videos or browse our Documentation.

Design and Development FAQs

Here are a few of the more common questions asked about designing and developing with You.i Engine React Native.

What kind of services does the You.i TV design team offer?

The You.i TV design team offers a variety of services. Our specialties include interaction, visual, and motion design. From consulting to branding, we have the experience to drive projects, recognize gaps, and support our partners. We do all of the following:

  • Kickstart projects
  • Adapt brands for interactivity
  • Audit designs
  • Validate designs
  • Offer consulting services
  • Conduct creative brainstorming sessions
  • Conduct feature analysis
  • Create prototypes
  • Conduct design workshops
  • Conduct user testing
  • Adapt designs for cross-platform

Is there workflow documentation?

Heck yes. We’ve accumulated a lot of material over the years. The majority of it can be found in the AE Workflow section of our Portal. We also have a production workflow cheatsheet.

How do you share project files across a team?

At You.i TV, we use GitHub to share project files, but you may decide to use another method. GitHub is a web-based version control repository and Internet hosting service. Finished files are pushed to GitHub, where team members can pull project updates. This enables large teams to work across great distances, and offers an element of safety, as application modifications can be easily merged or reverted.

Designers often use Git desktop clients to commit their updates. SourceTree and GitKraken are the most popular choices. Designers may wish to hand off their project files to developers directly. This can be done through email, Slack, or USB key.

In some cases, there may be multiple people working with project files. To avoid confusion, it’s recommended that project teams use a Baton server. This is a tool that allows users to lock and unlock After Effects project files, preventing merge conflicts. It’s always a good idea to communicate to team members what you’re working on, when you expect to be finished, and if there are any dependencies.

What are the main differences between designing for mobile and 10ft platforms?

While the functionality of an application tends to remain relatively consistent, there are a few concepts that should be considered when designing a cross-platform application. Learn more about designing a 10ft UI here.

Navigation and Interaction

  • Mobile applications rely on touch input, gestures, and swiping
  • 10ft applications are interacted with via remote:
    • Concept of an item being ‘in focus’
    • D-pad navigation typically follows an up/down/left/right model

Screen size and pixel density

  • 1920x1080 looks a lot different on a phone than on a TV
  • Watching video on a TV is a ‘lean back’ experience, while watching video on a mobile device is a ‘lean forward’ experience
  • Font size needs to be adjusted to ensure readability

TV Safe Zone

  • Content close to the edge of a page on a television can get cut off due to overscan
  • As a general guideline, leave a 15% buffer around the outside of a page/view
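
One way to apply that guideline in code is shown below. The 15% figure is this FAQ’s rule of thumb rather than a broadcast standard, and the helper is illustrative, not an engine API.

```javascript
// Compute a TV-safe rectangle by insetting each edge by the given
// buffer fraction (15% per side, per the guideline above).
function safeZone(width, height, buffer = 0.15) {
  const insetX = Math.round(width * buffer);
  const insetY = Math.round(height * buffer);
  return {
    x: insetX,
    y: insetY,
    width: width - 2 * insetX,
    height: height - 2 * insetY,
  };
}

console.log(safeZone(1920, 1080));
// → { x: 288, y: 162, width: 1344, height: 756 }
```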

How flexible are You.i TV applications once they’re built?

Once the functionality has been built, the actual look and feel of the application is highly customizable. Aesthetic elements such as fonts, colors, sizes, placement, alignment, orientation, and animations can be updated with little effort. However, if a new feature needs to be built into the application, additional development is required.

How does You.i TV integrate A/B testing into their apps?

A/B testing can be integrated into a project many different ways. Typically, designers will provide alternate components, which can be swapped within the application via code using an ‘if’ statement, for example: if user = ‘A’, load page A, otherwise load page B. The designer would build pages A and B, then it’s up to the developers to wire the functionality.
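
That swap can be sketched in a few lines of JavaScript. The names below are illustrative, not a real testing SDK: the designer supplies both page variants, and the code selects one by test group.

```javascript
// Sketch of the variant swap described above: pick a page component
// based on the user's assigned test group.
const pages = {
  A: () => "rendering page A",
  B: () => "rendering page B",
};

function loadPage(userGroup) {
  const page = pages[userGroup] || pages.B; // fall back to the control
  return page();
}

console.log(loadPage("A")); // → rendering page A
```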

What are designers responsible for when building You.i TV apps?

It depends on the role of the designer. At You.i TV, our designers cover a variety of roles and often cross from one discipline to another. Ideally, designers are involved in a project from concept to delivery. As product owners, designers put the user first and are integral to the success of a project.

Interaction Designers:

  • Define feature set
  • Create site maps
  • Create wireframes
  • Define user acceptance criteria
  • Build prototypes

Visual Designers:

  • Apply branding
  • Establish look and feel
  • Design iconography
  • Select fonts
  • Create style guides
  • Produce the visual design (Photoshop, Sketch, etc)

Motion Designers:

  • Create motion style guides
  • Build functional components in skeleton format
  • Apply the visual designs
  • Add interactions and transitions
  • Export, build and test the app
  • Support developers

Quality Assurance:

  • Perform UX reviews

Do you have to build a separate app for each platform?

Nope. We reuse as many components as possible, and additional form factors can be added to a single production file. However, design needs may dictate changes: while it’s possible to create one design that works across a variety of platforms, best practices may call for a different layout between handset and 10ft.

Asset Root Locator

Alternate compositions can be stored in unique folders within a project file, allowing designers to alter layout and content between platforms. As long as there are no functional differences between platforms, there is no need to alter code.

Can I use all the tools within After Effects?

Unfortunately not. AE is an amazing tool, and is the industry standard for motion graphics, but not everything is supported. The following features are currently not supported by You.i Engine React Native:

  • Animated masks
  • Expressions
  • Track mattes
  • Blending modes
  • Effects and Presets (with the exception of blur and tint)

What are the limitations of building user interfaces in After Effects?

Designers truly are empowered when leveraging the You.i AE workflow. However, they are still reliant on developers to wire actions to components. For example, a designer can build a play button, but can’t initialize video playback. Designers can define interactions and animations, but may need to rely on developers to fully implement the feature.

How much of an impact does device hardware have?

A major one. While You.i Engine React Native boasts optimized performance across most devices, including multi-threading, view-pooling, and GPU caching, not all hardware is created equal.

  • Memory - this is typically the largest concern, as video and imagery occupy space in memory. The more memory, the more information that can be stored in a cache and accessed quickly. If memory runs low, app performance suffers; this is referred to as ‘memory pressure.’ To conserve memory, try to limit the number of large textures and images.
  • CPU - a heavy load on the central processing unit can cause animations to stutter. Oftentimes we can stagger actions to offset CPU load. In some cases, animations might need to be toned down to play back smoothly on lower-end devices.
  • GPU - the better the graphics processing unit, the better the device’s performance. The GPU is dedicated to rendering and storing graphic textures.

A few practices help work within these constraints:

  • Design for the highest-end device and scale back animations/features for lower-end devices
  • Attempt to reuse assets to save memory
  • Leverage 9-patch graphics to reduce memory footprint
  • Iterate animations and interactions - test on all platforms and optimize animations accordingly

How do I set up my workspace to build and test applications?

It’s always a good idea to test application features on the physical devices they’re intended for. While it’s okay to test in the preview tool, or some sort of simulator, there is no replacement for the real thing. Different platforms require different setup. In some cases, it might be easier to test an application via TestFlight, HockeyApp, or a similar service.

Do designers need to know how to code to build apps?

No. It always helps to have some background in the industry, but it isn’t required. Designers don’t write code, but they do apply classes and parameters to help define the functionality and behaviors of an application.

If After Effects is for making videos, how can it make apps?

Rather than render movies, the You.i AE plugin enables users to export application data. Since applications are dynamic in nature, the designer exports their project files. Within these exports is all the information required to build the application. Everything from fonts to animations is accounted for.