IMediaEngine
The IMediaEngine class.
addListener
Adds one IMediaEngineEvent listener.
addListener?<EventType extends keyof IMediaEngineEvent>( eventType: EventType, listener: IMediaEngineEvent[EventType] ): void;
Details
After calling this method, you can listen for the corresponding events in the IMediaEngine object and obtain data through IMediaEngineEvent. Depending on your project needs, you can add multiple listeners for the same event.
Parameters
- eventType
- The name of the target event to listen for. See IMediaEngineEvent.
- listener
- The callback function for eventType. Take adding a listener for onPlaybackAudioFrameBeforeMixing as an example:
const onPlaybackAudioFrameBeforeMixing = (channelId: string, uid: number, audioFrame: AudioFrame) => {};
engine.addListener('onPlaybackAudioFrameBeforeMixing', onPlaybackAudioFrameBeforeMixing);
createCustomAudioTrack
Creates a custom audio track.
abstract createCustomAudioTrack( trackType: AudioTrackType, config: AudioTrackConfig ): number;
Details
- Call this method to create a custom audio track and get the audio track ID.
- Call joinChannel to join the channel. In ChannelMediaOptions, set publishCustomAudioTrackId to the audio track ID that you want to publish, and set publishCustomAudioTrack to true.
- Call pushAudioFrame and specify trackId as the audio track ID set in step 2. You can then publish the corresponding custom audio source in the channel.
Parameters
- trackType
- The type of the custom audio track. See AudioTrackType.
Attention: If AudioTrackDirect is specified for this parameter, you must set publishMicrophoneTrack to false in ChannelMediaOptions when calling joinChannel to join the channel; otherwise, joining the channel fails and returns the error code -2.
- config
- The configuration of the custom audio track. See AudioTrackConfig.
Returns
- If the method call is successful, the audio track ID is returned as the unique identifier of the audio track.
- If the method call fails, 0xffffffff is returned.
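Based on the return values above, checking the result can be sketched as follows. The helper function is hypothetical, not part of the SDK; 0xffffffff is the documented failure value.

```typescript
// Hypothetical helper: interpret the value returned by createCustomAudioTrack.
// Per the docs above, 0xffffffff signals failure; any other value is the
// audio track ID.
const INVALID_AUDIO_TRACK_ID = 0xffffffff;

function isValidAudioTrackId(returned: number): boolean {
  return returned !== INVALID_AUDIO_TRACK_ID;
}

// A real call would look like:
// const trackId = engine.createCustomAudioTrack(trackType, config);
console.log(isValidAudioTrackId(0xffffffff)); // false
console.log(isValidAudioTrackId(1));          // true
```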
destroyCustomAudioTrack
Destroys the specified audio track.
abstract destroyCustomAudioTrack(trackId: number): number;
Details
Parameters
- trackId
- The custom audio track ID returned in createCustomAudioTrack.
Returns
- 0: Success.
- < 0: Failure.
pullAudioFrame
Pulls the remote audio data.
abstract pullAudioFrame(): AudioFrame;
After a successful call of this method, the app pulls the decoded and mixed audio data for playback.
Call timing
Call this method after joining a channel.
Before calling this method, call setExternalAudioSink(enabled: true) to notify the app to enable and set the external audio rendering.
Restrictions
- After calling this method, the app proactively pulls the audio data from the SDK. By setting the audio data parameters, the SDK adjusts the frame buffer to help the app handle latency, effectively avoiding audio playback jitter.
- In contrast, after you register the onPlaybackAudioFrame callback, the SDK pushes the audio data to the app through the callback, and any delay in processing the audio frames may result in audio jitter.
This method is only used for retrieving audio data after remote mixing. If you need to get audio data from different audio processing stages, such as capture and playback, you can register the corresponding callbacks by calling registerAudioFrameObserver.
Returns
- The AudioFrame instance, if the method call succeeds.
- An error code, if the method call fails.
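When allocating a playback buffer for pulled audio, the frame size follows from the sample rate and channel count. A minimal sketch, assuming 16-bit PCM and an illustrative 10 ms frame duration (neither is an SDK guarantee):

```typescript
// Sketch: expected size in bytes of one pulled audio frame, assuming
// 16-bit PCM samples. The 10 ms frame duration is an illustrative
// assumption, not an SDK constant.
function pulledFrameBytes(
  sampleRateHz: number,
  channels: number,
  frameDurationMs: number = 10
): number {
  const samplesPerChannel = (sampleRateHz / 1000) * frameDurationMs;
  const bytesPerSample = 2; // 16-bit PCM
  return samplesPerChannel * channels * bytesPerSample;
}

// 48 kHz stereo, 10 ms frame:
console.log(pulledFrameBytes(48000, 2)); // 1920
```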
pushAudioFrame
Pushes the external audio frame.
abstract pushAudioFrame(frame: AudioFrame, trackId?: number): number;
Call this method to push external audio frames through the audio track.
Call timing
- Call createCustomAudioTrack to create a custom audio track and get the audio track ID.
- Call joinChannel to join the channel. In ChannelMediaOptions, set publishCustomAudioTrackId to the audio track ID that you want to publish, and set publishCustomAudioTrack to true.
Restrictions
None.
Parameters
- frame
- The external audio frame. See AudioFrame.
- trackId
- The audio track ID. If you want to publish a custom external audio source, set this parameter to the ID of the corresponding custom audio track you want to publish.
Returns
- 0: Success.
- < 0: Failure.
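The push cadence follows from the frame duration: with 10 ms frames, one second of audio means 100 calls. A sketch with a local stub standing in for the engine (the frame shape and 10 ms cadence are simplified assumptions, not the exact SDK types):

```typescript
// Sketch: pacing external audio pushes. A real app calls
// engine.pushAudioFrame(frame, trackId) once per captured frame; here a
// stub stands in for the engine so the cadence arithmetic can be shown.
type AudioFrameLite = { samplesPerChannel: number; channels: number };

const pushed: AudioFrameLite[] = [];
const stubEngine = {
  pushAudioFrame(frame: AudioFrameLite, trackId?: number): number {
    pushed.push(frame);
    return 0; // 0 means success, matching the Returns section above
  },
};

// One second of 48 kHz stereo audio in 10 ms frames = 100 pushes.
const frameMs = 10;
for (let t = 0; t < 1000; t += frameMs) {
  stubEngine.pushAudioFrame({ samplesPerChannel: 480, channels: 2 }, 1);
}
console.log(pushed.length); // 100
```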
pushVideoFrame
Pushes the external raw video frame to the SDK through video tracks.
abstract pushVideoFrame( frame: ExternalVideoFrame, videoTrackId?: number ): number;
Details
- Call createCustomVideoTrack to create a video track and get the video track ID.
- Call joinChannel to join the channel. In ChannelMediaOptions, set customVideoTrackId to the video track ID that you want to publish, and set publishCustomVideoTrack to true.
- Call this method and specify videoTrackId as the video track ID set in step 2. You can then publish the corresponding custom video source in the channel.
- If you no longer need to capture external video data, you can call destroyCustomVideoTrack to destroy the custom video track.
- If you only want to use the external video data for local preview and not publish it in the channel, you can call muteLocalVideoStream to stop sending the video stream, or call updateChannelMediaOptions to set publishCustomVideoTrack to false.
Parameters
- frame
- The external raw video frame to be pushed. See ExternalVideoFrame.
- videoTrackId
- The video track ID returned by calling the createCustomVideoTrack method. The default value is 0.
Returns
- 0: Success.
- < 0: Failure.
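Sizing the pixel buffer of the pushed frame is standard raster arithmetic for the common raw formats. A sketch (the helper names are hypothetical; the byte counts are general RGBA/I420 math, not SDK-specific constants):

```typescript
// Sketch: buffer sizes for common raw video formats that an
// ExternalVideoFrame might carry. Plain raster math, nothing SDK-specific.
function rgbaBufferBytes(width: number, height: number): number {
  return width * height * 4; // 4 bytes per RGBA pixel
}

function i420BufferBytes(width: number, height: number): number {
  // Full-resolution Y plane plus quarter-resolution U and V planes.
  return width * height + 2 * ((width / 2) * (height / 2));
}

console.log(rgbaBufferBytes(640, 360)); // 921600
console.log(i420BufferBytes(640, 360)); // 345600
```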
registerAudioFrameObserver
Registers an audio frame observer object.
abstract registerAudioFrameObserver(observer: IAudioFrameObserver): number;
Call this method to register an audio frame observer object (register a callback). When you need the SDK to trigger the onMixedAudioFrame, onRecordAudioFrame, onPlaybackAudioFrame, onPlaybackAudioFrameBeforeMixing or onEarMonitoringAudioFrame callback, you need to use this method to register the callbacks.
Call timing
Call this method before joining a channel.
Restrictions
None.
Parameters
- observer
- The observer instance. See IAudioFrameObserver.
Agora recommends calling this method after receiving onLeaveChannel to release the audio observer object.
Returns
- 0: Success.
- < 0: Failure.
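The shape of an observer object can be sketched as follows. The callback name matches the list above, but the parameter shapes are simplified assumptions rather than the exact IAudioFrameObserver signatures; a local invocation stands in for the SDK triggering the callback.

```typescript
// Sketch of an audio frame observer object. Parameter shapes are
// simplified assumptions, not the exact IAudioFrameObserver signatures.
type AudioFrameLite = { samplesPerChannel: number; channels: number };

let framesSeen = 0;
const observer = {
  onPlaybackAudioFrame(channelId: string, frame: AudioFrameLite): boolean {
    framesSeen += 1; // keep work here light: delays can cause audio jitter
    return true;
  },
};

// A real app registers it before joining a channel:
// engine.registerAudioFrameObserver(observer);
observer.onPlaybackAudioFrame('demo', { samplesPerChannel: 480, channels: 2 });
console.log(framesSeen); // 1
```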
registerFaceInfoObserver
Registers a facial information observer.
abstract registerFaceInfoObserver(observer: IFaceInfoObserver): number;
Details
You can call this method to register the onFaceInfo callback to receive the facial information processed by Agora speech driven extension. When calling this method to register a facial information observer, you can register callbacks in the IFaceInfoObserver class as needed. After successfully registering the facial information observer, the SDK triggers the callback you have registered when it captures the facial information converted by the speech driven extension.
- Call this method before joining a channel.
- Before calling this method, you need to make sure that the speech driven extension has been enabled by calling enableExtension.
Applicable scenarios
Facial information processed by the Agora speech driven extension is BS (Blend Shape) data that complies with ARKit standards. You can further process the BS data using third-party 3D rendering engines, for example, to drive an avatar's mouth movements to match the speech.
Parameters
- observer
- Facial information observer, see IFaceInfoObserver.
Returns
- 0: Success.
- < 0: Failure.
registerVideoEncodedFrameObserver
Registers a receiver object for the encoded video image.
abstract registerVideoEncodedFrameObserver( observer: IVideoEncodedFrameObserver ): number;
Details
If you only want to observe encoded video frames (such as in H.264 format) without decoding and rendering the video, Agora recommends that you implement an IVideoEncodedFrameObserver class and register it through this method.
Call this method before joining a channel.
Parameters
- observer
- The video frame observer object. See IVideoEncodedFrameObserver.
Returns
- 0: Success.
- < 0: Failure.
registerVideoFrameObserver
Registers a raw video frame observer object.
abstract registerVideoFrameObserver(observer: IVideoFrameObserver): number;
If you want to observe raw video frames (such as in YUV or RGBA format), Agora recommends that you implement an IVideoFrameObserver class and register it with this method.
When calling this method to register a video observer, you can register callbacks in the IVideoFrameObserver class as needed. After you successfully register the video frame observer, the SDK triggers the registered callbacks each time a video frame is received.
Applicable scenarios
After registering the raw video observer, you can use the obtained raw video data in various video pre-processing scenarios, such as implementing virtual backgrounds and image enhancement yourself.
Call timing
Call this method before joining a channel.
Restrictions
- When network conditions deteriorate, the video resolution decreases incrementally.
- If the user adjusts the video profile, the resolution of the video returned in the callbacks also changes.
Parameters
- observer
- The observer instance. See IVideoFrameObserver.
Returns
- 0: Success.
- < 0: Failure.
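Because of the restrictions above, a raw video frame observer must not assume a fixed resolution. A sketch (the callback name follows the SDK's observer style, but the frame shape here is a simplified stand-in, not the exact IVideoFrameObserver signature):

```typescript
// Sketch: a raw video frame observer that tracks resolution changes, since
// the docs note the returned resolution can change with network conditions
// or the video profile. Frame shape is a simplified assumption.
type VideoFrameLite = { width: number; height: number };

let lastSize = '';
const observer = {
  onCaptureVideoFrame(frame: VideoFrameLite): boolean {
    const size = `${frame.width}x${frame.height}`;
    if (size !== lastSize) {
      lastSize = size; // e.g. reallocate processing buffers here
    }
    return true;
  },
};

observer.onCaptureVideoFrame({ width: 1280, height: 720 });
observer.onCaptureVideoFrame({ width: 640, height: 360 });
console.log(lastSize); // "640x360"
```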
removeAllListeners
Removes all listeners for the specified event.
removeAllListeners?<EventType extends keyof IMediaEngineEvent>( eventType?: EventType ): void;
Parameters
- eventType
- The name of the target event whose listeners you want to remove. See IMediaEngineEvent.
removeListener
Removes the specified IMediaEngineEvent listener.
removeListener?<EventType extends keyof IMediaEngineEvent>( eventType: EventType, listener: IMediaEngineEvent[EventType] ): void;
Details
For listened events, if you no longer need to receive the callback message, you can call this method to remove the corresponding listener.
Parameters
- eventType
- The name of the target event whose listener you want to remove. See IMediaEngineEvent.
- listener
- The callback function for eventType. Must pass in the same function object as in addListener. Take removing the listener for onPlaybackAudioFrameBeforeMixing as an example:
// Create an onPlaybackAudioFrameBeforeMixing object
const onPlaybackAudioFrameBeforeMixing = (channelId: string, uid: number, audioFrame: AudioFrame) => {};
// Add one onPlaybackAudioFrameBeforeMixing listener
engine.addListener('onPlaybackAudioFrameBeforeMixing', onPlaybackAudioFrameBeforeMixing);
// Remove the onPlaybackAudioFrameBeforeMixing listener
engine.removeListener('onPlaybackAudioFrameBeforeMixing', onPlaybackAudioFrameBeforeMixing);
setExternalAudioSource
Sets the external audio source parameters.
abstract setExternalAudioSource( enabled: boolean, sampleRate: number, channels: number, localPlayback?: boolean, publish?: boolean ): number;
- Deprecated:
- This method is deprecated, use createCustomAudioTrack instead.
Call timing
Call this method before joining a channel.
Restrictions
None.
Parameters
- enabled
- Whether to enable the external audio source:
true: Enable the external audio source.
false: (Default) Disable the external audio source.
- sampleRate
- The sample rate (Hz) of the external audio source, which can be set as 8000, 16000, 32000, 44100, or 48000.
- channels
- The number of channels of the external audio source, which can be set as 1 (Mono) or 2 (Stereo).
- localPlayback
- Whether to play the external audio source locally:
true: Play the external audio source.
false: (Default) Do not play the external audio source.
- publish
- Whether to publish audio to the remote users:
true: (Default) Publish audio to the remote users.
false: Do not publish audio to the remote users.
Returns
- 0: Success.
- < 0: Failure.
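Validating the arguments against the accepted values listed above can be sketched as follows. The helper is hypothetical, not part of the SDK; it only encodes the value ranges from the Parameters section.

```typescript
// Sketch: validate sampleRate and channels before calling
// setExternalAudioSource, using the accepted values from the docs above.
const VALID_SAMPLE_RATES = [8000, 16000, 32000, 44100, 48000];

function isValidExternalAudioConfig(
  sampleRate: number,
  channels: number
): boolean {
  return (
    VALID_SAMPLE_RATES.includes(sampleRate) &&
    (channels === 1 || channels === 2)
  );
}

console.log(isValidExternalAudioConfig(48000, 2)); // true
console.log(isValidExternalAudioConfig(22050, 1)); // false
```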
setExternalMediaProjection
Configures MediaProjection outside of the SDK to capture screen video streams.
abstract setExternalMediaProjection(mediaProjection: any): number;
After successfully calling this method, the external MediaProjection you set will replace the MediaProjection requested by the SDK to capture the screen video stream.
When the screen sharing is stopped or IRtcEngine is destroyed, the SDK will automatically release the MediaProjection.
Applicable scenarios
If you have already obtained a MediaProjection instance, you can directly use your MediaProjection instead of the one applied for by the SDK. The following lists two applicable scenarios:
- On custom system devices, it can avoid system pop-ups (such as requiring user permission to capture the screen) and directly start capturing the screen video stream.
- In a screen sharing process that involves one or more sub-processes, it can help avoid errors that might occur when creating objects within these sub-processes, which could otherwise lead to failures in screen capturing.
Call timing
Call this method after startScreenCapture.
Restrictions
Before calling this method, you must first apply for MediaProjection permission.
Parameters
- mediaProjection
- An object used to capture screen video streams.
Returns
- 0: Success.
- < 0: Failure.
setExternalRemoteEglContext
Sets the EGL context for rendering remote video streams.
abstract setExternalRemoteEglContext(eglContext: any): number;
This method can replace the default remote EGL context within the SDK, making it easier to manage the EGL context.
When the engine is destroyed, the SDK will automatically release the EGL context.
Applicable scenarios
This method is suitable for using a custom video rendering method instead of the default SDK rendering method to render remote video frames in Texture format.
Call timing
Call this method before joining a channel.
Restrictions
None.
Parameters
- eglContext
- The EGL context for rendering remote video streams.
Returns
- 0: Success.
- < 0: Failure.
setExternalVideoSource
Configures the external video source.
abstract setExternalVideoSource( enabled: boolean, useTexture: boolean, sourceType?: ExternalVideoSourceType, encodedVideoOption?: SenderOptions ): number;
After calling this method to enable an external video source, you can call pushVideoFrame to push external video data to the SDK.
Call timing
Call this method before joining a channel.
Restrictions
Dynamic switching of video sources is not supported within the channel. To switch from an external video source to an internal video source, you must first leave the channel, call this method to disable the external video source, and then rejoin the channel.
Parameters
- enabled
- Whether to use the external video source:
true: Use the external video source. The SDK prepares to accept the external video frame.
false: (Default) Do not use the external video source.
- useTexture
- Whether to use the external video frame in the Texture format:
true: Use the external video frame in the Texture format.
false: (Default) Do not use the external video frame in the Texture format.
- sourceType
- Whether the external video frame is encoded. See ExternalVideoSourceType.
- encodedVideoOption
- Video encoding options. This parameter needs to be set if sourceType is EncodedVideoFrame. To set this parameter, contact technical support.
Returns
- 0: Success.
- < 0: Failure.
unregisterAudioFrameObserver
Unregisters an audio frame observer.
abstract unregisterAudioFrameObserver(observer: IAudioFrameObserver): number;
Parameters
- observer
- The audio frame observer, reporting the reception of each audio frame. See IAudioFrameObserver.
Returns
- 0: Success.
- < 0: Failure.
unregisterFaceInfoObserver
Unregisters a facial information observer.
abstract unregisterFaceInfoObserver(observer: IFaceInfoObserver): number;
Details
Parameters
- observer
- Facial information observer, see IFaceInfoObserver.
Returns
- 0: Success.
- < 0: Failure.
unregisterVideoEncodedFrameObserver
Unregisters a receiver object for the encoded video frame.
abstract unregisterVideoEncodedFrameObserver( observer: IVideoEncodedFrameObserver ): number;
Parameters
- observer
- The video observer, reporting the reception of each video frame. See IVideoEncodedFrameObserver.
Returns
- 0: Success.
- < 0: Failure.
unregisterVideoFrameObserver
Unregisters the video frame observer.
abstract unregisterVideoFrameObserver(observer: IVideoFrameObserver): number;
Parameters
- observer
- The video observer, reporting the reception of each video frame. See IVideoFrameObserver.
Returns
- 0: Success.
- < 0: Failure.