Provides an interface to the native platform video player.
Refers directly to the CYIAbstractVideoPlayer After Effects object.
Example:
import { Composition, VideoRef } from '@youi/react-native-youi';
import { AppRegistry, Button, StyleSheet, Text, View } from 'react-native';
...
<Composition source="VideoRef_MainComp">
  <VideoRef
    name="VideoSurface"
    ref={(ref) => { this.video = ref; }}
    source={this.state.videoSource}
    bufferLength={{
      min: 5000,
      max: 15000,
    }}
    muted={this.state.muted}
    mediaPlaybackControlsEnabled={this.state.mediaPlaybackControlsEnabled}
    selectedAudioTrack={VideoRef.getAudioTrackId(this.state.audioTracks.map(track => track.id), this.state.selectedAudioTrack)}
    selectedClosedCaptionsTrack={VideoRef.getClosedCaptionsTrackId(this.state.closedCaptionTracks.map(track => track.id), this.state.selectedClosedCaptionsTrack)}
    maxBitrate={this.state.maxBitrate}
    onBufferingStarted={() => console.log("onBufferingStarted called.")}
    onBufferingEnded={() => console.log("onBufferingEnded called.")}
    onErrorOccurred={(error) => {
      this.setState({
        videoSource: {
          uri: '',
          type: '',
          headers: {
            HeaderName1: "HeaderValue1",
            HeaderName2: "HeaderValue2"
          },
        },
        errorCode: error.nativeEvent.errorCode,
        nativePlayerErrorCode: error.nativeEvent.nativePlayerErrorCode,
        errorMessage: error.nativeEvent.message
      });
    }}
    onPreparing={() => console.log("onPreparing called.")}
    onReady={() => console.log("onReady called.")}
    onPlaying={() => { this.setState({ paused: false }); console.log("onPlaying called."); }}
    onPaused={() => { this.setState({ paused: true }); console.log("onPaused called."); }}
    onPlaybackComplete={() => console.log("onPlaybackComplete called.")}
    onFinalized={() => console.log("onFinalized called.")}
    onCurrentTimeUpdated={(currentTime) => {
      this.setState({
        currentTime: currentTime
      });
    }}
    onDurationChanged={(duration) => {
      this.setState({
        duration: duration
      });
    }}
    onStateChanged={(playerState) => {
      this.setState({
        playbackState: playerState.nativeEvent.playbackState,
        mediaState: playerState.nativeEvent.mediaState
      });
    }}
    onAvailableAudioTracksChanged={(audioTracks) => {
      const tempAudioTracksArray = [];
      audioTracks.nativeEvent.forEach(element => {
        tempAudioTracksArray.push({
          id: element.id,
          name: element.name,
          language: element.language,
          valid: element.valid
        });
      });
      this.setState({
        audioTracks: tempAudioTracksArray
      });
      // If there are not enough audio tracks for the selected track, set to the default first track.
      if (this.state.selectedAudioTrack >= audioTracks.nativeEvent.length) {
        this.setState({
          selectedAudioTrack: 0
        });
      }
    }}
    onAvailableClosedCaptionsTracksChanged={(closedCaptionTracks) => {
      const tempClosedCaptionTracksArray = [];
      closedCaptionTracks.nativeEvent.forEach(element => {
        tempClosedCaptionTracksArray.push({
          id: element.id,
          name: element.id == VideoRef.getClosedCaptionsOffId() && element.name == "" ? OFF_TRACK_NAME : element.name,
          language: element.language
        });
      });
      this.setState({
        closedCaptionTracks: tempClosedCaptionTracksArray
      });
      // If there are not enough closed captions tracks for the selected track, disable closed captions.
      if (this.state.selectedClosedCaptionsTrack >= closedCaptionTracks.nativeEvent.length) {
        this.setState({
          selectedClosedCaptionsTrack: -1
        });
      }
      // If at this point closed captions are indicated to be disabled ('selectedClosedCaptionsTrack = -1'),
      // set it to the position of the disabled track in the array, for simpler tracking of the selected track position.
      if (this.state.selectedClosedCaptionsTrack < 0) {
        this.setState({
          selectedClosedCaptionsTrack: tempClosedCaptionTracksArray.map(track => track.id).indexOf(VideoRef.getClosedCaptionsOffId())
        });
      }
    }}
  />
</Composition>
static getAudioTrackId(audioTrackIds, selectedAudioTrack)
Gets an audio track ID based on an array of audio track IDs and the selected index among those tracks.
static getClosedCaptionsTrackId(closedCaptionsTrackIds, selectedClosedCaptionsTrack)
Gets a closed captions track ID based on an array of closed captions track IDs and the selected index among those tracks.
static getClosedCaptionsOffId()
Returns the constant ID corresponding to closed captions disabled.
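A minimal sketch of using these helpers, assuming the track arrays and selected indexes are kept in component state as in the example above:
// Resolve the ID to pass to the selectedAudioTrack prop.
const audioTrackId = VideoRef.getAudioTrackId(
  this.state.audioTracks.map(track => track.id),
  this.state.selectedAudioTrack
);

// Find the position of the 'captions off' entry in the closed captions track list.
const offIndex = this.state.closedCaptionTracks
  .map(track => track.id)
  .indexOf(VideoRef.getClosedCaptionsOffId());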
Additional video restrictions apply when implementing a You.i React Native app for deployment on Roku. Because of how video playback is handled in a Roku app, additional restrictions exist when implementing functionality based on video player state and events. For details, contact your You.i TV representative.
Roku may have particular requirements for implementing bookmarking, ads, and other such functionality. Ensure you meet all applicable Roku requirements if you are implementing a You.i React Native app for deployment on Roku.
name
After Effects layer name
Type | Required |
---|---|
String | Yes |
source
The video source; either a remote URL or a local file resource.
Type | Required |
---|---|
object | Yes |
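A minimal sketch of a source object is shown below; the URI, stream type, and start time are placeholders, and the supported type and DRM values depend on your target platforms:
source={{
  uri: 'https://example.com/stream/master.m3u8', // placeholder: remote URL or local file resource
  type: 'HLS',                                   // placeholder: stream type
  startTimeMs: 30000,                            // optional: start 30 seconds in
  headers: {                                     // optional custom HTTP headers (currently PS4 only)
    HeaderName1: 'HeaderValue1'
  }
}}
The source object supports the following properties: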
type: string
uri: string
drmScheme: string
startTimeMs: number. The time, in milliseconds, from where the video asset should start playing. If excluded, the video asset starts from the beginning and a live stream starts from the live head. It is also supported for Roku.
headers: object. Custom HTTP headers for video asset requests. There are no specifically required properties. Each property of the object represents an HTTP header, where the name of the property is the name of the header and the value of the property is the value of the header. The headers property is currently only supported on PS4, but support on more target platforms is planned for future releases. If you use this property on a target platform where it’s not supported, you’ll see the following warning in the console log:
The `headers` property of the video source was not empty, but this platform does not support custom headers in video HTTP requests.
drmInfo: object. The following is required for each supported DRM scheme:
none: null
fairplay: null
widevine_modular: object
Example:
{
  licenseAcquisitionUrl: string,
  headers: array of {
    key: string,
    value: string
  }
}
playready: string
muted
Determines whether the video plays audio or not.
Default is false, which always plays audio.
Type | Required |
---|---|
bool | No |
visible
Determines whether the component should be visible or not.
Default is true.
Type | Required |
---|---|
bool | No |
bufferLength
Customize the video playback buffer length.
Type | Required |
---|---|
object | No |
min: number. The value, in milliseconds, below which the video player should not allow the buffer length to fall.
max: number. The value, in milliseconds, that the video player should not let the buffer length exceed.
Currently supported for Android. The buffer length must be set before preparing the media.
initialBandwidthHint
Sets the initial bandwidth hint to the number specified, in bits per second. The next time the player is prepared, the hint selects an initial playback variant with a suitable bitrate, if streaming from an adaptive media source. By default, no hint is specified and the player selects the initial playback variant using its default logic. It is supported on Android, iOS, tvOS, and Roku.
Type | Required |
---|---|
number | No |
selectedAudioTrack
Switches the video player’s audio to the track indicated by the given index. If the video is playing, playback isn’t stopped during the switch.
Type | Required |
---|---|
number | No |
selectedClosedCaptionsTrack
Switches the video player’s closed captions track to the track indicated by the given index. If the video is playing, playback isn’t stopped during the switch.
Type | Required |
---|---|
number | No |
userAgent
Sets the user agent to be used by the player for requests on manifests and video segments.
The user agent must be set before the media is prepared with Prepare(); otherwise, setting the user agent has no effect on current playback. To restore the platform default, set the user agent to an empty string ("").
Type | Required |
---|---|
string | No |
maxBitrate
Sets the maximum bitrate to the number specified, in bits per second. The player then uses this value when streaming from an adaptive media source. By default, no bitrate restriction is applied.
If maxBitrate is set during playback, it won’t take effect until content that’s already been buffered has played back.
Setting maxBitrate to 0 removes any bitrate restrictions.
maxBitrate must be set before preparing the media. For Roku with You.i Roku Cloud, you can set it using the MaxBandwidth metadata attribute before preparing media and starting playback.
Type | Required |
---|---|
number | No |
mediaPlaybackControlsEnabled
Enables or disables an Android Media Session, allowing the player to automatically handle media playback controls. Default is false.
Voice controls through assistants like Google Assistant and Amazon Alexa, and some media controls (such as headphone controls), are only available through the Android Media Session and aren’t provided as key events.
Due to this limitation, when mediaPlaybackControlsEnabled
is set to false there is no way to handle these media events via You.i Platform’s API.
When mediaPlaybackControlsEnabled
is false (default), media playback controls aren’t automatically handled and must have appropriate app-side logic.
Media key events bubble up to your application, where you must implement media playback control logic.
See the note above for important limitations.
When mediaPlaybackControlsEnabled
is true, media playback controls are automatically handled by native player logic.
No app-side logic is required to handle media playback events.
Media key events are suppressed and aren’t bubbled up to the app.
To add custom handling in this case, use the related prop mediaPlaybackControlsHandlers
.
Your application can choose to handle media key events itself (leaving mediaPlaybackControlsEnabled false), or to let the native player handle playback controls and intercept specific controls with the related prop (mediaPlaybackControlsHandlers).
For example, let’s assume that you want to prevent a user from using seek (fast forward) when an advertisement is showing. You can choose to conditionally disable automatic media playback controls and trap key events that attempt to skip ahead. However, recalling the notice above about voice assistants, this may also prevent your user from pausing/playing the video via the audio or headphone controls they could use prior to viewing the ad.
If you want to conditionally manage player controls, we suggest you set mediaPlaybackControlsEnabled
to true and register handlers with mediaPlaybackControlsHandlers
.
Your handlers are called whenever the user accesses player controls, allowing you to programmatically determine what to do.
Type | Required |
---|---|
boolean | No |
Note that our testing shows that some audio and headphone devices forward media player actions as key events. The forwarding of these events is outside the engine’s control. To your application they look like any other play, pause, or stop event.
mediaPlaybackControlsHandlers
Allows you to pass in an object containing actions to perform when seek, play, pause, or stop is called from an Android 10-foot remote.
For example, this prop allows you to prevent a seek
from occurring while an ad is playing, but you can still execute some code when the remote controls are received.
Note that these intercepts are called only if the prop mediaPlaybackControlsEnabled
is set to true.
Type | Required |
---|---|
object | No |
Example:
mediaPlaybackControlsHandlers={{
  seek: (seekPoint) => {
    if (isAdPlaying()) {
      console.log('Ad is playing, prevent seeking');
    } else {
      // Allow the seek to happen
      this.seek(seekPoint);
    }
  },
  play: () => {
    console.log('Play command has been intercepted.')
  },
  pause: () => {
    console.log('Pause command has been intercepted.')
  },
  stop: () => {
    console.log('Stop command has been intercepted.')
  }
}}
metadata
Sets the video player metadata. For Roku, this prop must be used if your application uses custom DRM attributes. See Video DRM for Roku.
Type | Required |
---|---|
object | No |
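A hedged sketch of the metadata prop; BookmarkInterval and MaxBandwidth are the keys referenced elsewhere on this page, and the values shown are illustrative only:
metadata={{
  // Illustrative values only; the attributes and units your app needs
  // depend on your stream, DRM setup, and Roku requirements.
  BookmarkInterval: 30,
  MaxBandwidth: 2500
}}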
onBufferingStarted
Invoked when player buffering started. An application can display a buffering indicator when this signal is emitted. Currently, it is not supported for the Roku target platform.
Type | Required |
---|---|
function | No |
onBufferingEnded
Invoked when player buffering has ended. An application can clear a buffering indicator, if one has been displayed, when this signal is emitted. The video is in either a playing or paused state, depending on the state the player was in prior to buffering. Currently, it is not supported for the Roku target platform.
Type | Required |
---|---|
function | No |
onErrorOccurred
Invoked when the player reports an error with:
{
  nativeEvent: {
    errorCode,
    nativePlayerErrorCode,
    message
  }
}
Your application can display an error notification when this signal is emitted.
You can also use the resulting error code for application logic.
When using ExoPlayer with Android, the value for nativePlayerErrorCode
will always be empty; all errors are returned via message
.
Type | Required |
---|---|
function | No |
onPreparing
Invoked when the player has started preparing a video for playback. Occurs only when the target platform supports the provided video format. Currently, it is not supported for the Roku target platform.
Type | Required |
---|---|
function | No |
onReady
Invoked when the video is ready to be interacted with and information about the video can be queried. Currently, it is not supported for the Roku target platform.
Type | Required |
---|---|
function | No |
onCompositionDidLoad
Invoked when the parent composition is loaded successfully and the Ref component is attached to it.
onPlaying
Invoked when playback has begun. This can either be the initial playback or playback resuming after paused or buffering states.
Type | Required |
---|---|
function | No |
onPaused
Invoked when playback has paused.
Type | Required |
---|---|
function | No |
onPlaybackComplete
Invoked when the playback of the currently loaded video media has completed. After this signal is emitted the player is in the paused state.
You can transition back into the playing state by using a Seek()
operation after this signal is emitted.
This is invoked each time the video media reaches the end of stream.
Type | Required |
---|---|
function | No |
onFinalized
Invoked once the player has shut down and been cleaned up.
This signal is sent on Stop()
.
However, in order for onFinalized
to be triggered, the video player must be finished preparing.
After onReady()
has been called, the video player is ready to be interacted with.
For Roku, onPlaybackComplete can be used instead of onFinalized.
Type | Required |
---|---|
function | No |
onCurrentTimeUpdated
Invoked when the current time in the video playback has updated. The callback returns a number; the current time of the video playback in ms.
Type | Required |
---|---|
function | No |
onCurrentTimeUpdatedThresholdMs
Returns how often the current time in the video playback should be updated. The callback returns a number; the threshold time in ms.
Type | Required |
---|---|
function | No |
For Roku with You.i Roku Cloud, you can set it using the metadata prop with the BookmarkInterval key before preparing and starting playback.
onDurationChanged
Invoked when the duration of the loaded video media has changed.
This typically occurs after loading new video media.
You.i TV recommends that you update the duration visible to the application user when this is invoked.
The callback returns a number, which represents the duration of the video in ms.
For example:
onDurationChanged={(duration) => {}}
Type | Required |
---|---|
function | No |
onStateChanged
Invoked when the player state has changed.
Type | Required |
---|---|
function | No |
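As shown in the example at the top of this page, the callback receives an event whose nativeEvent carries the playback and media states. A minimal sketch:
onStateChanged={(playerState) => {
  // nativeEvent carries the current playback and media states.
  const { playbackState, mediaState } = playerState.nativeEvent;
  console.log(`Playback state: ${playbackState}, media state: ${mediaState}`);
}}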
onAvailableAudioTracksChanged
Invoked when the available audio tracks for the current media have changed. This may be emitted at any point.
Type | Required |
---|---|
function | No |
onAvailableClosedCaptionsTracksChanged
Invoked when the available closed captions tracks for the current media have changed. This may be emitted at any point.
Type | Required |
---|---|
function | No |
onTimedMetadata
Invoked when timed metadata is available.
Platforms supported under this prop include iOS, tvOS, macOS, Android, PS4, UWP, Roku, and Tizen. Please note that the metadata reported for Tizen doesn’t exactly match that reported by other supported platforms.
Type | Required |
---|---|
dictionary | No |
The dictionary will have the following content:
Name | Type | Required | Description | Notes |
---|---|---|---|---|
identifier | string | Yes | Identifier for the metadata content. Possible values are TXXX (for user-defined text information frames) and PRIV (for private frames). | For Cloud Solution, identifier is called data. |
value | string | Yes | The metadata value. The data in the object returned varies, depending on the identifier. | For Cloud Solution, the app receives the raw onTimedMetadata dictionary, which is value, and converts it to a string. |
additionalData | object | Yes, if the identifier is PRIV. | Allows identification of the private frame owner. | Currently supported only for AVPlayer (iOS, tvOS, and macOS). Format as follows: additionalData: {ID3PrivateFrameOwner: "string value"}, where string value is the owner identifier part of the ID3 Private Frame. |
timestamp | number (microseconds) | Yes | The time at which the metadata is located within the media being played. | For Cloud Solution, the _decodeInfo_pts inside the onTimedMetadata JSON dictionary represents timestamp. |
duration | number (microseconds) | Yes | The time duration for which the metadata is relevant within the media being played. If the duration is not specified, it will be set to a value larger than the complete duration of the media. | For Cloud Solution, the EventDuration inside the onTimedMetadata JSON dictionary represents duration. |
For Cloud Solution, the identifier
is called data
and the value
is the raw onTimedMetadata
JSON dictionary, which is received by the app.
The app parses the JSON dictionary and converts it to a string.
The following JSON dictionary is an example of what a Cloud Solution app will receive as an identifier
and value
:
{
  "data": {
    "_decodeInfo_pts": 1.214,
    "EventDuration": 65535,
    "Id": 0123456789,
    "MessageData": "ID3\u0001",
    "MessageDataBase16": "494432207B426C616820426C61687D",
    "Offset": 0,
    "SchemeID": "www.abc.com:id3:v1",
    "Source": "emsg",
    "Timescale": 90000,
    "Value": "1",
    "XlinkHref": ""
  },
  "event": "timedMetadata"
}
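A hedged handler sketch for onTimedMetadata, assuming the dictionary is delivered on the event's nativeEvent as with the other callbacks on this page; field availability varies by platform:
onTimedMetadata={(event) => {
  // Field names follow the table above; this assumes the dictionary
  // arrives on nativeEvent, as with the other callbacks on this page.
  const { identifier, value, timestamp, duration } = event.nativeEvent;
  console.log(`Timed metadata ${identifier} at ${timestamp} us (duration ${duration} us): ${value}`);
}}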
onAudioBitrateChanged
Invoked when the audio bitrate has changed. This may be emitted at any time. The callback passes a new bitrate in kbps.
Type | Required |
---|---|
function | No |
onVideoBitrateChanged
Invoked when the video bitrate has changed. This may be emitted at any time. The callback passes a new bitrate in kbps.
Type | Required |
---|---|
function | No |
onTotalBitrateChanged
Invoked when the combined video and audio bitrate has changed. This may be emitted at any time. The callback passes a new bitrate in kbps.
Type | Required |
---|---|
function | No |
testID
A unique identifier for this element to be used in UI automation testing scripts.
Type | Required |
---|---|
string | No |
seek()
seek(positionInMS : number)
Seeks the video to a specific position in the video media. When seeking backwards after playback has completed, the player remains paused.
Parameters:
Name | Type | Required | Description |
---|---|---|---|
positionInMS | number | Yes | Video position to seek to |
Ensure that positionInMS is a whole number, not a decimal number, to properly execute the seek() function.
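For example, assuming the ref is stored as this.video as in the example at the top of this page:
// Seek to the 90-second mark; Math.round guarantees a whole-number position.
this.video.seek(Math.round(90.5 * 1000));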
play()
play()
Starts the playback of the prepared video asset, or resumes playback if the player is paused.
If you are using a customized Android manifest file for the Android target platform, you need to add the FOREGROUND_SERVICE permission to the manifest file as shown below:
<uses-permission android:name="android.permission.FOREGROUND_SERVICE"/>
For apps using the manifest template from You.i Platform, this permission is added automatically.
pause()
pause()
Pauses video playback.
stop()
stop()
Stops video playback. Allows users to switch between multiple videos and unload media.
getStatistics()
getStatistics()
Returns a promise for the latest statistics from the player:
isLive: boolean
totalBitrateKbps: number
videoBitrateKbps: number
audioBitrateKbps: number
defaultTotalBitrateKbps: number
defaultVideoBitrateKbps: number
defaultAudioBitrateKbps: number
bufferLengthMs: number
minimumBufferLengthMs: number
framesPerSecond: number
Currently, it is not supported for the Roku target platform.
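A minimal sketch of consuming the returned promise, assuming the ref is stored as this.video as in the example at the top of this page:
this.video.getStatistics().then((stats) => {
  // Log a few of the fields listed above.
  console.log(`live: ${stats.isLive}, total: ${stats.totalBitrateKbps} kbps, fps: ${stats.framesPerSecond}`);
});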
getPlayerInformation()
getPlayerInformation()
Returns a promise for the following data about the player:
name: string
version: string
getLiveSeekableRanges()
getLiveSeekableRanges()
Returns a promise with an array of objects describing the time ranges, with start time (ms) and end time (ms), within the current stream that are valid to seek to. Each element in the array has the following fields:
startTimeMs: number
endTimeMs: number
Depending on the stream, there is no guarantee that the time ranges are continuous. If the seekable time ranges of a live stream are unknown, an empty array is returned.
Only available for live streaming on all supported target platforms, except Roku.
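A minimal sketch, assuming the ref is stored as this.video as in the example at the top of this page:
this.video.getLiveSeekableRanges().then((ranges) => {
  if (ranges.length > 0) {
    // Seek to the start of the most recent seekable range.
    this.video.seek(ranges[ranges.length - 1].startTimeMs);
  }
});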