VideoRef

Provides an interface to the native platform video player. Corresponds directly to the engine's CYIAbstractVideoPlayer and attaches to a video surface layer exported from After Effects.

Example Usage

import { Composition, VideoRef } from '@youi/react-native-youi';
import { AppRegistry, Button, StyleSheet, Text, View } from 'react-native';
...
<Composition source="VideoRef_MainComp">
 <VideoRef
  name="VideoSurface"
  ref={(ref) => {this.video=ref}}
  source={this.state.videoSource}
  bufferLength={{
    min:5000,
    max:15000,
    }}
  muted={this.state.muted}
  mediaPlaybackControlsEnabled={this.state.mediaPlaybackControlsEnabled}
  selectedAudioTrack={VideoRef.getAudioTrackId(this.state.audioTracks.map(track => track.id), this.state.selectedAudioTrack)}
  selectedClosedCaptionsTrack={VideoRef.getClosedCaptionsTrackId(this.state.closedCaptionTracks.map(track => track.id), this.state.selectedClosedCaptionsTrack)}
  maxBitrate={this.state.maxBitrate}
  onBufferingStarted={() => console.log("onBufferingStarted called.")}
  onBufferingEnded={() => console.log("onBufferingEnded called.")}
  onErrorOccurred={(error) => {
   this.setState({
    videoSource: {
     uri: '',
     type: '',
     headers: {
      HeaderName1: "HeaderValue1",
      HeaderName2: "HeaderValue2"
     },
    },
    errorCode: error.nativeEvent.errorCode,
    nativePlayerErrorCode: error.nativeEvent.nativePlayerErrorCode,
    errorMessage: error.nativeEvent.message
   })}
  }
  onPreparing={() => console.log("onPreparing called.") }
  onReady={() => console.log("onReady called.")}
  onPlaying={() => {this.setState({paused: false}); console.log("onPlaying called.");}}
  onPaused={() => {this.setState({paused: true}); console.log("onPaused called.");}}
  onPlaybackComplete={() => console.log("onPlaybackComplete called.")}
  onFinalized={() => console.log("onFinalized called.")}
  onCurrentTimeUpdated={(currentTime) => {
   this.setState({
    currentTime: currentTime
   })}
  }
  onDurationChanged={(duration) => {
   this.setState({
    duration: duration
   })}
  }
  onStateChanged={(playerState) => {
   this.setState({
    playbackState: playerState.nativeEvent.playbackState,
    mediaState: playerState.nativeEvent.mediaState
   })}
  }
  onAvailableAudioTracksChanged={(audioTracks) => {
    const tempAudioTracksArray = []
   audioTracks.nativeEvent.forEach(element => {
    tempAudioTracksArray.push({
     id: element.id,
     name: element.name,
     language: element.language,
     valid: element.valid
    })
   });
   this.setState({
    audioTracks: tempAudioTracksArray
   })
   // If there are not enough audio tracks for the selected track, set to the default first track.
   if(this.state.selectedAudioTrack >= audioTracks.nativeEvent.length)
   {
    this.setState({
     selectedAudioTrack: 0
    })
   }}
  }
  onAvailableClosedCaptionsTracksChanged={(closedCaptionTracks) => {
    const tempClosedCaptionTracksArray = []
   closedCaptionTracks.nativeEvent.forEach(element => {
    tempClosedCaptionTracksArray.push({
     id: element.id,
     name: element.id == VideoRef.getClosedCaptionsOffId() && element.name == "" ? OFF_TRACK_NAME : element.name,
     language: element.language
    })
   });
   this.setState({
    closedCaptionTracks: tempClosedCaptionTracksArray
   })
   // If there are not enough closed captions tracks for the selected track, disable closed captions.
   if(this.state.selectedClosedCaptionsTrack >= closedCaptionTracks.nativeEvent.length)
   {
    this.setState({
     selectedClosedCaptionsTrack: -1
    })
   }
   // If at this point, closed captions are indicated to be disabled with 'selectedClosedCaptionsTrack = -1', set it to the position of the disabled track in the array, for simpler tracking of the selected track position.
   if(this.state.selectedClosedCaptionsTrack < 0)
   {
    this.setState({
     selectedClosedCaptionsTrack: tempClosedCaptionTracksArray.map(track => track.id).indexOf(VideoRef.getClosedCaptionsOffId())
    })
   }}
  }
 />
</Composition>

Helper Functions

  • static getAudioTrackId(audioTrackIds, selectedAudioTrack)

    Gets an audio track ID based on an array of audio track IDs and the selected index among those tracks.

  • static getClosedCaptionsTrackId(closedCaptionsTrackIds, selectedClosedCaptionsTrack)

    Gets a closed captions track ID based on an array of closed captions track IDs and the selected index among those tracks.

  • static getClosedCaptionsOffId()

    Returns the constant ID corresponding to closed captions disabled.
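
For example, to turn closed captions off you can select the entry whose ID matches getClosedCaptionsOffId() (a minimal sketch, using the closedCaptionTracks state array maintained in the example above):

const offIndex = this.state.closedCaptionTracks
  .map(track => track.id)
  .indexOf(VideoRef.getClosedCaptionsOffId());
this.setState({ selectedClosedCaptionsTrack: offIndex });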

You.i Roku Cloud Considerations

Additional video restrictions apply when implementing a You.i React Native app for deployment as a Roku app. In a Roku app:

  • Video is streamed directly from the local Roku player.
  • All playback and associated controls are handled by the local Roku player.
  • Cloud Solution playback currently supports only full screen video mode.

Because of these circumstances, additional restrictions exist when implementing functionality based on video player state and events. For details, contact your You.i TV representative.

Roku may have particular requirements for implementing bookmarking, ads, and other such functionality. Ensure you meet all applicable Roku requirements if you are implementing a You.i React Native app for deployment on Roku.


Reference

Prop: name

The name of the After Effects layer that this component attaches to.

Type Required
String Yes

Prop: source

The video source; either a remote URL or a local file resource. An example follows the property descriptions below.

Type Required
object Yes
  • type: string
  • uri: string
  • drmScheme: string
  • startTimeMs: number. The time, in milliseconds, from where the video asset should start playing. If excluded, the video asset starts from the beginning and a live stream starts from the live head. It is also supported for Roku.
  • headers: object. Custom HTTP headers for video asset requests. There are no specifically required properties. Each property of the object represents an HTTP header, where the name of the property is the name of the header and the value of the property is the value of the header.

  • drmInfo: object

    The following is required for each supported DRM scheme:

    • none: null
    • fairplay: null
    • widevine_modular: object

      Example:

      {
         licenseAcquisitionUrl: string,
         headers: [
            {
               key: string,
               value: string
            }
         ]
      }
      
    • playready: string
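
Example (a sketch only; the URLs, header names, token, and license values are placeholders, and the type and drmScheme strings must match formats supported on your target platforms):

source={{
  uri: 'https://example.com/streams/main.mpd',  // placeholder manifest URL
  type: 'DASH',                                 // placeholder format string
  startTimeMs: 30000,                           // start 30 s into the asset
  headers: {
    Authorization: 'Bearer placeholder-token'   // any custom HTTP headers
  },
  drmScheme: 'widevine_modular',
  drmInfo: {
    licenseAcquisitionUrl: 'https://example.com/license',  // placeholder
    headers: [
      { key: 'X-Custom-Header', value: 'placeholder-value' }
    ]
  }
}}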

Prop: muted

Determines whether the video's audio is muted. The default is false, meaning audio plays.

Type Required
bool No

Prop: visible

Determines whether the component should be visible or not. Default is true. Note the following:

  • Component visibility is different from component opacity.
  • Components that are not visible are not focusable.

Type Required
bool No

Prop: bufferLength

Customizes the video playback buffer length.

Type Required
object No
  • min: number. The value, in milliseconds, that the video player should not allow the buffer length to fall below.
  • max: number. The value, in milliseconds, that the video player should not let the buffer length exceed.

Prop: initialBandwidthHint

Sets the initial bandwidth hint to the number specified, in bits per second. The next time the player is prepared, the hint selects an initial playback variant with a suitable bitrate, if streaming from an adaptive media source. By default, no hint is specified and the player selects the initial playback variant using its default logic. It is supported on Android, iOS, tvOS, and Roku.

Type Required
number No

Prop: selectedAudioTrack

Switches the video player’s audio to the track indicated by the given track ID. Use the getAudioTrackId helper to convert a selected index into an ID. If the video is playing, playback isn’t stopped during the switch.

Type Required
number No

Prop: selectedClosedCaptionsTrack

Switches the video player’s closed captions to the track indicated by the given track ID. Use the getClosedCaptionsTrackId helper to convert a selected index into an ID, or getClosedCaptionsOffId to disable closed captions. If the video is playing, playback isn’t stopped during the switch.

Type Required
number No

Prop: userAgent

Sets the user agent to be used by the player for requests on manifests and video segments.

Notes

  • Currently supported for Android, PS4, Roku Cloud Solution, and Xbox One.
  • The default user agent varies per platform.
  • Set the user agent before calling Prepare(). Otherwise, setting the user agent has no effect on current playback.
  • To reset the default, set the user agent to an empty string ("").

Limitations for Roku Cloud Solution

  • Once a custom User-Agent header is set, the Roku native player caches it and uses it for all video requests. The default User-Agent header is restored only after the device is powered off or the app is reinstalled. You can replace one custom User-Agent header with another between videos, but you cannot revert to the default any other way.
  • For some videos, the initial request uses the custom header, but segment fetches use the default. There is currently no workaround for this issue.

Type Required
string No

Prop: maxBitrate

Sets the maximum bitrate to the number specified, in bits per second. The player then uses this value when streaming from an adaptive media source. By default, no bitrate restriction is applied.

Notes

  • If maxBitrate is set during playback, it won’t take effect until content that’s already been buffered has played back.
  • Setting maxBitrate to 0 removes any bitrate restrictions.
  • For Tizen, maxBitrate must be set before preparing the media.
  • Currently not supported for Roku. For Roku, set the MaxBandwidth metadata attribute before preparing media and starting playback.

Type Required
number No
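
Example (a sketch; the 4 Mbps cap and the limitBitrate state field are illustrative):

// Cap adaptive streaming at 4 Mbps while limitBitrate is set;
// passing 0 removes any bitrate restriction.
<VideoRef
  name="VideoSurface"
  source={this.state.videoSource}
  maxBitrate={this.state.limitBitrate ? 4000000 : 0}
/>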

Prop: mediaPlaybackControlsEnabled

Enables or disables an Android Media Session, allowing the player to automatically handle media playback controls.

  • Default value is false.
  • Applies to Android only.

When mediaPlaybackControlsEnabled is false (default), media playback controls aren’t automatically handled and must be covered by appropriate app-side logic. Media key events bubble up to your application, where you must implement media playback control logic. See the note at the end of this prop description for important limitations.

When mediaPlaybackControlsEnabled is true, media playback controls are automatically handled by native player logic. No app-side logic is required to handle media playback events. Media key events are suppressed and aren’t bubbled up to the app. To add custom handling in this case, use the related prop mediaPlaybackControlsHandlers.

Your application can choose to:

  • always manage media keys directly (set the prop to false)
  • manage media keys depending on context in your application (set the prop to false only when needed)
  • use the Media Session to always manage controls automatically (set the prop to true)
  • augment the automatic Media Session controls with a set of handlers (set the prop to true and see mediaPlaybackControlsHandlers)

For example, let’s assume that you want to prevent a user from using seek (fast forward) when an advertisement is showing. You can choose to conditionally disable automatic media playback controls and trap key events that attempt to skip ahead (see the example at the end of this prop description). However, recalling the note below about audio and headphone devices, this may also prevent your user from pausing/playing the video via the audio or headphone controls they could use prior to viewing the ad.

If you want to conditionally manage player controls, we suggest you set mediaPlaybackControlsEnabled to true and register handlers with mediaPlaybackControlsHandlers. Your handlers are called whenever the user accesses player controls, allowing you to programmatically determine what to do.

Type Required
boolean No

Note that our testing shows that some audio and headphone devices forward media player actions as key events. The forwarding of these events is outside the engine’s control. To your application they look like any other play, pause, or stop event.
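
Example (a sketch of conditionally disabling automatic controls while an ad is showing, as described above; this.state.adPlaying is an illustrative field):

// Let the Android Media Session handle controls during normal playback,
// but hand media key events back to the app while an ad is showing.
<VideoRef
  name="VideoSurface"
  source={this.state.videoSource}
  mediaPlaybackControlsEnabled={!this.state.adPlaying}
/>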


Prop: mediaPlaybackControlsHandlers

Allows you to pass in an object containing actions to perform when seek, play, pause, or stop is called from an Android 10-foot remote. For example, this prop allows you to prevent a seek from occurring while an ad is playing, but you can still execute some code when the remote controls are received. Note that these intercepts are called only if the prop mediaPlaybackControlsEnabled is set to true.

Type Required
object No

Example:

mediaPlaybackControlsHandlers={{
    seek: (seekPoint) => {
      if(isAdPlaying()) {
          console.log('Ad is playing, prevent seeking');
      } else {
          // Allow the seek to happen
          this.seek(seekPoint);
      }
    },
    play: () => {
        console.log('Play command has been intercepted.')
    },
    pause: () => {
        console.log('Pause command has been intercepted.')
    },
    stop: () => {
        console.log('Stop command has been intercepted.')
    }
  }}

Prop: metadata

Sets the video player metadata. For Roku, this prop must be used if your application uses custom DRM attributes. See Video DRM for Roku.

Type Required
object No

Prop: onBufferingStarted

Invoked when player buffering has started. An application can display a buffering indicator when this signal is emitted. Currently, it is not supported for the Roku target platform.

Type Required
function No

Prop: onBufferingEnded

Invoked when player buffering has ended. An application can clear a buffering indicator, if one has been displayed, when this signal is emitted. The video is in either a playing or a paused state, depending on the state the player was in prior to buffering. Currently, it is not supported for the Roku target platform.

Type Required
function No

Prop: onErrorOccurred

Invoked when the player reports an error with:

{
  nativeEvent: {
    errorCode,
    nativePlayerErrorCode,
    message
  }
}

Your application can display an error notification when this signal is emitted. You can also use the resulting error code for application logic. When using ExoPlayer with Android, the value for nativePlayerErrorCode will always be empty; all errors are returned via message.

Type Required
function No

Prop: onPreparing

Invoked when the player has started preparing a video for playback. Occurs only when the target platform supports the provided video format. Currently, it is not supported for the Roku target platform.

Type Required
function No

Prop: onReady

Invoked when the video is ready to be interacted with and information about the video can be queried. Currently, it is not supported for the Roku target platform.

Type Required
function No

Prop: onCompositionDidLoad

Invoked when the parent composition is loaded successfully and the Ref component is attached to it.

Type Required
function No

Prop: onPlaying

Invoked when playback has begun. This can either be the initial playback or playback resuming after paused or buffering states.

Type Required
function No

Prop: onPaused

Invoked when playback has paused.

Type Required
function No

Prop: onPlaybackComplete

Invoked when the playback of the currently loaded video media has completed. After this signal is emitted the player is in the paused state.

Type Required
function No

Prop: onFinalized

Invoked once the player has shut down and been cleaned up. This signal is sent on stop(). However, for onFinalized to be triggered, the video player must have finished preparing; that is, onReady must already have been called.

Notes

  • On PlayStation 4, the player stalls if it is deleted or prepared while it is still stopping. You.i TV recommends that you listen for this signal before doing so.
  • Currently, this prop is not supported for the Roku target platform, but onPlaybackComplete can be used instead for Roku.

Type Required
function No

Prop: onCurrentTimeUpdated

Invoked when the current time in the video playback has updated. The callback receives a number: the current time of the video playback, in ms.

Type Required
function No

Prop: onCurrentTimeUpdatedThresholdMs

Determines how often the current time in the video playback should be updated. The callback returns a number: the threshold time, in ms.

Type Required
function No

Prop: onDurationChanged

Invoked when the duration of the loaded video media has changed. This typically occurs after loading new video media. You.i TV recommends that you update the duration visible to the application user when this is invoked. The callback receives a number, which represents the duration of the video in ms. For example: onDurationChanged={(duration) => {}}

Type Required
function No

Prop: onStateChanged

Invoked when the player state has changed.

Type Required
function No
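
As in the example usage above, the callback receives an event whose nativeEvent contains the new playbackState and mediaState:

onStateChanged={(playerState) => {
  const { playbackState, mediaState } = playerState.nativeEvent;
  this.setState({ playbackState, mediaState });
}}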

Prop: onAvailableAudioTracksChanged

Invoked when the available audio tracks for the current media have changed. This may be emitted at any point.

Type Required
function No

Prop: onAvailableClosedCaptionsTracksChanged

Invoked when the available closed captions tracks for the current media have changed. This may be emitted at any point.

Type Required
function No

Prop: onTimedMetadata

Invoked when timed metadata is available.

Type Required
function No

The callback receives a dictionary with the following content:

  • identifier: string. Required. The identifier for the metadata content. Possible values are TXXX (for user-defined text information frames) and PRIV (for private frames). For Cloud Solution, identifier is called data.

  • value: string. Required. The metadata value. The data in the object returned varies depending on the identifier. For Cloud Solution, the app receives the raw onTimedMetadata dictionary as value and converts it to a string.

  • additionalData: object. Required if the identifier is PRIV. Allows identification of the private frame owner. Currently supported only for AVPlayer (iOS, tvOS, and macOS). Format: additionalData: {ID3PrivateFrameOwner: "string value"}, where string value is the owner identifier part of the ID3 Private Frame.

  • timestamp: number (microseconds). Required. The time at which the metadata is located within the media being played. For Cloud Solution, the _decodeInfo_pts field inside the onTimedMetadata JSON dictionary represents timestamp.

  • duration: number (microseconds). Required. The time duration for which the metadata is relevant within the media being played. If the duration is not specified, it is set to a value larger than the complete duration of the media. For Cloud Solution, the EventDuration field inside the onTimedMetadata JSON dictionary represents duration.
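
A minimal handler sketch, assuming the dictionary described above is passed directly to the callback (the console output is illustrative):

onTimedMetadata={(metadata) => {
  // identifier, value, timestamp, and duration are the fields described above.
  const { identifier, value, timestamp, duration } = metadata;
  console.log(`Timed metadata ${identifier} at ${timestamp} us:`, value);
}}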

For Cloud Solution Only

For Cloud Solution, the identifier is called data and the value is the raw onTimedMetadata JSON dictionary received by the app. The app parses the JSON dictionary and converts it to a string. The following is an example of what a Cloud Solution app receives as the identifier and value:

{
  "data": {
    "_decodeInfo_pts": 1.214,
    "EventDuration": 65535,
    "Id": 123456789,
    "MessageData": "ID3\u0001",
    "MessageDataBase16": "494432207B426C616820426C61687D",
    "Offset": 0,
    "SchemeID": "www.abc.com:id3:v1",
    "Source": "emsg",
    "Timescale": 90000,
    "Value": "1",
    "XlinkHref": ""
  },
  "event": "timedMetadata"
}

Prop: onAudioBitrateChanged

Invoked when the audio bitrate has changed. This may be emitted at any time. The callback receives the new bitrate, in kbps.

Type Required
function No

Prop: onVideoBitrateChanged

Invoked when the video bitrate has changed. This may be emitted at any time. The callback receives the new bitrate, in kbps.

Type Required
function No

Prop: onTotalBitrateChanged

Invoked when the combined video and audio bitrate has changed. This may be emitted at any time. The callback receives the new bitrate, in kbps.

Type Required
function No

Prop: testID

A unique identifier for this element to be used in UI automation testing scripts.

Type Required
string No

Method: seek()

seek(positionInMS : number)

Seeks the video to a specific position in the video media. When seeking backwards after playback has completed, the player remains paused.

Parameters:

Name Type Required Description
positionInMS number Yes Video position to seek to
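
For example, a sketch that uses the ref captured in the example usage above (this.video) to jump 30 seconds ahead of the last reported playback time:

// currentTime is kept up to date by the onCurrentTimeUpdated handler above.
if (this.video) {
  this.video.seek(this.state.currentTime + 30000);
}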

Method: play()

play()

Starts the playback of the prepared video asset, or resumes playback if the player is paused.


Method: pause()

pause()

Pauses video playback.


Method: stop()

stop()

Stops video playback. This allows the app to switch between multiple videos and unload media.


Method: getStatistics()

getStatistics()

Returns a promise for the latest statistics from the player:

  • isLive: boolean

  • totalBitrateKbps: number
  • videoBitrateKbps: number
  • audioBitrateKbps: number
  • defaultTotalBitrateKbps: number
  • defaultVideoBitrateKbps: number
  • defaultAudioBitrateKbps: number

  • bufferLengthMs: number
  • minimumBufferLengthMs: number
  • framesPerSecond: number

Currently, it is not supported for the Roku target platform.
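
For example, a sketch that logs a few of these fields using the ref from the example usage above:

this.video.getStatistics().then((stats) => {
  console.log(`Live stream: ${stats.isLive}`);
  console.log(`Total bitrate: ${stats.totalBitrateKbps} kbps`);
  console.log(`Buffer length: ${stats.bufferLengthMs} ms`);
});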


Method: getPlayerInformation()

getPlayerInformation()

Returns a promise for the following data about the player:

  • name: string
  • version: string

Method: getLiveSeekableRanges()

getLiveSeekableRanges()

Returns a promise that resolves to an array of objects describing the time ranges, each with a start time (ms) and end time (ms), within the current stream that are valid to seek to. Each element in the array has the following fields:

  • startTimeMs: number
  • endTimeMs: number

Depending on the stream, there is no guarantee that the time ranges are continuous. If the seekable time ranges of a live stream are unknown, an empty array is returned.
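
For example, a sketch that seeks to the end of the last seekable range (roughly the live edge), using the ref from the example usage above:

this.video.getLiveSeekableRanges().then((ranges) => {
  if (ranges.length > 0) {
    const lastRange = ranges[ranges.length - 1];
    this.video.seek(lastRange.endTimeMs);
  }
});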