The current state of audio processing, such as "playing", "paused", or "stopped".
The total duration of the audio (seconds).
Whether a media track is playing on the webpage:
true
: The media track is playing on the webpage.
false
: The media track is not playing on the webpage.
Since v4.18.0
The playback speed of the current audio file. Valid range is [50, 400], where:
50
: Half the original speed.
100
: (Default) The original speed.
400
: Four times the original speed.
Since v4.10.0
The destination of the current processing pipeline on the local audio track.
The [source]BufferSourceAudioTrackInitConfig.source specified when creating an audio track.
The type of a media track:
"audio"
: Audio track.
"video"
: Video track.
Gets a MediaStreamTrack object.
A MediaStreamTrack object.
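For example, a minimal sketch (`track` is a placeholder for any local or remote track created elsewhere):
// Sketch: hand the underlying MediaStreamTrack to native browser APIs.
const mediaStreamTrack = track.getMediaStreamTrack();
const stream = new MediaStream([mediaStreamTrack]); // e.g. usable with MediaRecorder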
Gets the RTCRtpTransceiver instance of the current track.
This method is currently mainly used for end-to-end encryption of video streams (Beta).
If the SDK experiences a reconnection, the RTCRtpTransceiver instance corresponding to the current track might change. You can obtain the new RTCRtpTransceiver instance through the following callbacks:
- For a local track: [ILocalTrack.transceiver-updated]event_transceiver_updated
- For a remote track: [IRemoteTrack.transceiver-updated]event_transceiver_updated_2
Optional type: StreamType
The type of the video stream. See StreamType.
The RTCRtpTransceiver instance of the current track.
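As an illustration, a minimal sketch (`localTrack` is a placeholder for a track created elsewhere; it assumes the transceiver-updated callback receives the new instance, as the links above suggest):
// Sketch: read the current transceiver and refresh it after reconnections.
let transceiver = localTrack.getRTCRtpTransceiver();
localTrack.on("transceiver-updated", (newTransceiver) => {
  // The SDK may replace the instance after a reconnection.
  transceiver = newTransceiver;
});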
Gets the statistics of a local audio track.
Deprecated from v4.1.0. Use [AgoraRTCClient.getLocalVideoStats]IAgoraRTCClient.getLocalVideoStats and [AgoraRTCClient.getLocalAudioStats]IAgoraRTCClient.getLocalAudioStats instead.
Gets the label of a local track.
The label that the SDK returns may include:
- The MediaDeviceInfo.label property, if the track is created by calling createMicrophoneAudioTrack or createCameraVideoTrack.
- The sourceId property, if the track is created by calling createScreenVideoTrack.
- The MediaStreamTrack.label property, if the track is created by calling createCustomAudioTrack or createCustomVideoTrack.
The event name.
See [source-state-change]event_source_state_change.
Occurs when the state of processing the audio buffer in [BufferSourceAudioTrack]IBufferSourceAudioTrack changes.
The state of processing the audio buffer:
"stopped"
: The SDK stops processing the audio buffer, for example, because the processing has finished or has been stopped manually.
"paused"
: The SDK pauses the processing of the audio buffer.
"playing"
: The SDK is processing the audio buffer.
IBufferSourceAudioTrack Events
When the specified event happens, the SDK triggers the callback that you pass.
The event name.
The callback function.
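For example, a minimal sketch using the source-state-change event described above (`bufferSourceTrack` is a placeholder for a track created via createBufferSourceAudioTrack):
// Sketch: listen for audio-buffer processing state changes.
bufferSourceTrack.on("source-state-change", (currentState) => {
  console.log("audio buffer state:", currentState); // "stopped", "paused", or "playing"
});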
The playback speed. Valid range is [50, 400], where:
50
: Half the original speed.
100
: (Default) The original speed.
400
: Four times the original speed.
Since v4.18.0
Sets the playback speed for the current audio file.
You can call this method before or after joining a channel.
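For instance, a minimal sketch; the method name setPlaybackSpeed here is an assumption based on the description above:
// Sketch: play the audio file at double speed, then restore the original speed.
// Assumption: the setter described above is exposed as setPlaybackSpeed.
bufferSourceTrack.setPlaybackSpeed(200); // 200 = twice the original speed
bufferSourceTrack.setPlaybackSpeed(100); // 100 = (default) original speed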
Sets the callback for getting raw audio data in PCM format.
After you successfully set the callback, the SDK continuously returns the audio frames of a local audio track in this callback, using an AudioBuffer object.
You can set the frameSize parameter to determine the frame size in each callback, which affects the interval between callbacks: the larger the frame size, the longer the interval between them.
track.setAudioFrameCallback((buffer) => {
for (let channel = 0; channel < buffer.numberOfChannels; channel += 1) {
// Float32Array with PCM data
const currentChannelData = buffer.getChannelData(channel);
console.log("PCM data in channel", channel, currentChannelData);
}
}, 2048);
// ....
// Stop getting the raw audio data
track.setAudioFrameCallback(null);
The callback function for receiving the AudioBuffer object. If you set audioBufferCallback as null, the SDK stops getting raw audio data.
Optional frameSize: number
The number of samples of each audio channel that an AudioBuffer object contains. You can set frameSize as 256, 512, 1024, 2048, 4096, 8192, or 16384. The default value is 4096.
Whether to enable the track:
true
: Enable the track.
false
: Disable the track.
Since v4.0.0
Enables/Disables the track.
After a track is disabled, the SDK stops playing and publishing the track.
- Disabling a track does not trigger the [LocalTrack.on("track-ended")]event_track_ended event.
- If a track is published, disabling this track triggers the [user-unpublished]IAgoraRTCClient.event_user_unpublished event on the remote client, and re-enabling this track triggers the [user-published]IAgoraRTCClient.event_user_published event.
- Do not call setEnabled and setMuted together.
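A minimal usage sketch (`localAudioTrack` is a placeholder for a track created elsewhere; setEnabled is asynchronous):
// Sketch: disable the track (stops playing and publishing), then re-enable it.
await localAudioTrack.setEnabled(false);
// ...later, resume capture and publishing:
await localAudioTrack.setEnabled(true);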
Sends or stops sending the media data of the track.
Whether to stop sending the media data of the track:
true
: Stop sending the media data of the track.
false
: Resume sending the media data of the track.
Since v4.6.0
If the track is published, a successful call of setMuted(true) triggers the [user-unpublished]IAgoraRTCClient.event_user_unpublished event on the remote client, and a successful call of setMuted(false) triggers the [user-published]IAgoraRTCClient.event_user_published event.
- Calling setMuted(true) does not stop capturing audio or video and takes effect faster than [[setEnabled]]. For details, see What are the differences between setEnabled and setMuted?
- Do not call setEnabled and setMuted together.
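A minimal usage sketch for comparison with setEnabled (again, `localAudioTrack` is a placeholder):
// Sketch: stop sending media without releasing the capture device.
await localAudioTrack.setMuted(true);
// ...later, resume sending:
await localAudioTrack.setMuted(false);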
The device ID, which can be retrieved by calling [[getPlaybackDevices]].
Since v4.1.0
Note:
- As of v4.7.0, this method no longer takes effect. Use [IRemoteAudioTrack.setPlaybackDevice]IRemoteAudioTrack.setPlaybackDevice instead.
Sets the playback device (speaker) for the remote audio stream.
Sets the volume of a local audio track.
The volume. The value ranges from 0 (mute) to 1000 (maximum). A value of 100 is the original volume. The volume change may not be noticeable to the human ear. If the local track has been published, setting the volume affects the volume heard by remote users.
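For example, a minimal sketch (`localAudioTrack` is a placeholder for a track created elsewhere):
// Sketch: halve the volume of a local audio track, then restore it.
localAudioTrack.setVolume(50);  // quieter than the original
localAudioTrack.setVolume(100); // original volume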
Starts processing the audio buffer.
Starting to process the audio buffer means that the SDK's processing unit has received the audio data. If the audio track has been published, remote users can hear the audio. Whether the local user can hear the audio depends on whether the SDK calls the [[play]] method and sends the audio data to the sound card.
Optional options: AudioSourceOptions
Options for processing the audio buffer. See [[AudioSourceOptions]].
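A minimal sketch; the `loop` option is an assumption about what [[AudioSourceOptions]] accepts:
// Sketch: start processing the audio buffer and play it locally.
// Assumption: AudioSourceOptions accepts a `loop` flag for repeated playback.
bufferSourceTrack.startProcessAudioBuffer({ loop: true });
bufferSourceTrack.play(); // local playback; remote users hear it once the track is published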
Inherited from [LocalAudioTrack]ILocalAudioTrack, BufferSourceAudioTrack is an interface for audio sourced from a local audio file. It adds several functions for controlling the processing of the audio buffer, such as starting processing, stopping processing, and seeking to a specified time location.
You can create an audio track from an audio file by calling [AgoraRTC.createBufferSourceAudioTrack]IAgoraRTC.createBufferSourceAudioTrack.
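Putting it together, a minimal end-to-end sketch (the file URL is a placeholder, and `client` stands for an AgoraRTCClient that has already joined a channel):
// Sketch: create a buffer source audio track from a file, play it, and publish it.
const bufferSourceTrack = await AgoraRTC.createBufferSourceAudioTrack({
  source: "https://example.com/audio.mp3", // placeholder URL
});
bufferSourceTrack.startProcessAudioBuffer(); // begin processing the audio data
bufferSourceTrack.play();                    // local playback
await client.publish(bufferSourceTrack);     // remote users can now hear the audio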