Interface IBufferSourceAudioTrack

Inherited from [LocalAudioTrack]ILocalAudioTrack, BufferSourceAudioTrack is an interface for audio sourced from a local audio file. It adds several functions for controlling the processing of the audio buffer, such as starting processing, pausing processing, stopping processing, and seeking to a specified time position.

You can create an audio track from an audio file by calling [AgoraRTC.createBufferSourceAudioTrack]IAgoraRTC.createBufferSourceAudioTrack.
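A minimal sketch of that flow, assuming the agora-rtc-sdk-ng API surface described in this reference. The `AgoraRTCLike` and `BufferTrackLike` interfaces below are simplified stand-ins for the real SDK types, not part of the SDK itself:

```typescript
// Simplified stand-ins for the SDK types; method names follow this reference.
interface BufferTrackLike {
  startProcessAudioBuffer(options?: { loop?: boolean }): void;
  play(): void;
}
interface AgoraRTCLike {
  // The real API also accepts a File or AudioBuffer as `source`.
  createBufferSourceAudioTrack(config: { source: string }): Promise<BufferTrackLike>;
}

// Create a track from an audio file URL, start processing the buffer so the
// audio data flows, then play it locally. Error handling omitted for brevity.
async function playAudioFile(rtc: AgoraRTCLike, url: string): Promise<BufferTrackLike> {
  const track = await rtc.createBufferSourceAudioTrack({ source: url });
  track.startProcessAudioBuffer({ loop: true });
  track.play();
  return track;
}
```

Note that creating the track alone produces no sound: processing must be started with startProcessAudioBuffer, and local playback additionally requires play.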

interface IBufferSourceAudioTrack {
    currentState: AudioSourceState;
    duration: number;
    enabled: boolean;
    isPlaying: boolean;
    muted: boolean;
    playbackSpeed: number;
    processorDestination: IAudioProcessor;
    source: null | string | File | AudioBuffer;
    trackMediaType: "audio" | "video";
    close(): void;
    getCurrentTime(): number;
    getListeners(event): Function[];
    getMediaStreamTrack(): MediaStreamTrack;
    getRTCRtpTransceiver(type?): undefined | RTCRtpTransceiver;
    getStats(): LocalAudioTrackStats;
    getTrackId(): string;
    getTrackLabel(): string;
    getVolumeLevel(): number;
    off(event, listener): void;
    on(event, listener): void;
    on(event, listener): void;
    once(event, listener): void;
    pauseProcessAudioBuffer(): void;
    pipe(processor): IAudioProcessor;
    play(): void;
    removeAllListeners(event?): void;
    resumeProcessAudioBuffer(): void;
    seekAudioBuffer(time): void;
    setAudioBufferPlaybackSpeed(speed): void;
    setAudioFrameCallback(audioFrameCallback, frameSize?): void;
    setEnabled(enabled): Promise<void>;
    setMuted(muted): Promise<void>;
    setPlaybackDevice(deviceId): Promise<void>;
    setVolume(volume): void;
    startProcessAudioBuffer(options?): void;
    stop(): void;
    stopProcessAudioBuffer(): void;
    unpipe(): void;
}


Properties

currentState: AudioSourceState

The current state of audio processing, such as start, pause, or stop.

duration: number

The total duration of the audio (seconds).

enabled: boolean
isPlaying: boolean

Whether a media track is playing on the webpage:

  • true: The media track is playing on the webpage.
  • false: The media track is not playing on the webpage.
muted: boolean
playbackSpeed: number

Since 4.18.0

The playback speed of the current audio file. Valid range is [50, 400], where:

  • 50: Half the original speed.
  • 100: (Default) The original speed.
  • 400: Four times the original speed.
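The percentage scale maps linearly onto a conventional speed multiplier. A hypothetical helper (not part of the SDK) that validates the range and performs the conversion:

```typescript
// Hypothetical helper, not part of the SDK: validates a playback speed
// percentage in [50, 400] and converts it to the equivalent multiplier.
function toSpeedMultiplier(percent: number): number {
  if (percent < 50 || percent > 400) {
    throw new RangeError("playback speed must be in [50, 400]");
  }
  return percent / 100; // 50 -> 0.5x, 100 -> 1x, 400 -> 4x
}
```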
processorDestination: IAudioProcessor

Since 4.10.0

The destination of the current processing pipeline on the local audio track.

source: null | string | File | AudioBuffer

The [source]BufferSourceAudioTrackInitConfig.source specified when creating an audio track.

trackMediaType: "audio" | "video"

The type of a media track:

  • "audio": Audio track.
  • "video": Video track.

Methods

  • close(): Closes a local track and releases the audio and video resources that it occupies.

    Once you close a local track, you can no longer reuse it.

    Returns void

  • getCurrentTime(): Gets the progress (seconds) of the audio buffer processing.

    Returns number

    The progress (seconds) of the audio buffer processing.

  • getListeners(event): Gets all the listeners for a specified event.

    Parameters

    • event: string

      The event name.

    Returns Function[]

  • getRTCRtpTransceiver(type?): Gets the RTCRtpTransceiver instance of the current track.

    This method is currently mainly used for end-to-end encryption of video streams (Beta).

    If the SDK experiences a reconnection, the RTCRtpTransceiver instance corresponding to the current track might change. You can obtain the new RTCRtpTransceiver instance through the following callbacks:

    • For a local track: [ILocalTrack.transceiver-updated]event_transceiver_updated
    • For a remote track: [IRemoteTrack.transceiver-updated]event_transceiver_updated_2

    Parameters

    • Optional type: StreamType

      The type of the video stream. See StreamType.

    Returns undefined | RTCRtpTransceiver

    The RTCRtpTransceiver instance of the current track.

  • getTrackId(): Gets the ID of a media track, a unique identifier generated by the SDK.

    Returns string

    The media track ID.

  • getTrackLabel(): Gets the label of a local track.

    Returns string

    The label that the SDK returns may include:

    • The MediaDeviceInfo.label property, if the track is created by calling createMicrophoneAudioTrack or createCameraVideoTrack.
    • The sourceId property, if the track is created by calling createScreenVideoTrack.
    • The MediaStreamTrack.label property, if the track is created by calling createCustomAudioTrack or createCustomVideoTrack.
  • getVolumeLevel(): Gets the audio level of a local audio track.

    Returns number

    The audio level. The value range is [0, 1], where 1 is the highest audio level. Usually, a user with an audio level above 0.6 is speaking.
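That rule of thumb can be applied directly; a tiny hypothetical helper (not part of the SDK) over getVolumeLevel()'s [0, 1] output:

```typescript
// Hypothetical helper: applies the 0.6 rule of thumb from the docs to decide
// whether the local user is likely speaking, given getVolumeLevel()'s output.
function isLikelySpeaking(level: number): boolean {
  return level > 0.6;
}
```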

  • off(event, listener): Removes the listener for a specified event.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback that corresponds to the event listener.

    Returns void

  • on(event, listener): Listens for the "source-state-change" event.

    Parameters

    • event: "source-state-change"

      The event name.

    • listener: ((currentState) => void)

      See [source-state-change]event_source_state_change.

        • (currentState): void
        • Occurs when the state of processing the audio buffer in [BufferSourceAudioTrack]IBufferSourceAudioTrack changes.

          Parameters

          • currentState: AudioSourceState

            The state of processing the audio buffer:

            • "stopped": The SDK stops processing the audio buffer. Reasons may include:
              • The SDK finishes processing the audio buffer.
              • The user manually stops the processing of the audio buffer.
            • "paused": The SDK pauses the processing of the audio buffer.
            • "playing": The SDK is processing the audio buffer.

          Returns void


    Returns void
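The three states above can drive application UI directly. A hedged sketch: the state names follow the "source-state-change" event documented above, while the label mapping is hypothetical:

```typescript
type AudioSourceStateLike = "stopped" | "paused" | "playing";

// Hypothetical mapping from processing state to a UI label.
const stateLabels: Record<AudioSourceStateLike, string> = {
  playing: "Processing audio buffer",
  paused: "Processing paused",
  stopped: "Processing stopped",
};

function describeSourceState(state: AudioSourceStateLike): string {
  return stateLabels[state];
}

// With a real IBufferSourceAudioTrack instance, the listener is attached as:
// track.on("source-state-change", (s) => console.log(describeSourceState(s)));
```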

  • on(event, listener): Listens for a specified event. When the specified event happens, the SDK triggers the callback that you pass.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback function.

    Returns void

  • once(event, listener): Listens for a specified event once.

    When the specified event happens, the SDK triggers the callback that you pass and then removes the listener.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback to trigger.

    Returns void

  • pauseProcessAudioBuffer(): Pauses processing the audio buffer.

    Returns void

  • pipe(processor): Inserts a Processor to the local audio track.

    Parameters

    • processor: IAudioProcessor

      The Processor instance. Each extension has a corresponding type of Processor.

    Returns IAudioProcessor

    The Processor instance.

  • play(): Plays a local audio track.

    When playing an audio track, you do not need to pass any DOM element.

    Returns void

  • removeAllListeners(event?): Removes all listeners for a specified event.

    Parameters

    • Optional event: string

      The event name. If left empty, all listeners for all events are removed.

    Returns void

  • resumeProcessAudioBuffer(): Resumes processing the audio buffer.

    Returns void

  • seekAudioBuffer(time): Jumps to a specified time point.

    Parameters

    • time: number

      The specified time point (seconds).

    Returns void

  • setAudioBufferPlaybackSpeed(speed): Sets the playback speed for the current audio file.

    Since 4.18.0

    You can call this method before or after joining a channel.

    Parameters

    • speed: number

      The playback speed. Valid range is [50, 400], where:

      • 50: Half the original speed.
      • 100: (Default) The original speed.
      • 400: Four times the original speed.

    Returns void

  • setAudioFrameCallback(audioFrameCallback, frameSize?): Sets the callback for getting raw audio data in PCM format.

    After you successfully set the callback, the SDK constantly returns the audio frames of a local audio track in this callback by using AudioBuffer.

    You can set the frameSize parameter to determine the frame size in each callback, which affects the interval between the callbacks. The larger the frame size, the longer the interval between them.

    track.setAudioFrameCallback((buffer) => {
      for (let channel = 0; channel < buffer.numberOfChannels; channel += 1) {
        // Float32Array with PCM data
        const currentChannelData = buffer.getChannelData(channel);
        console.log("PCM data in channel", channel, currentChannelData);
      }
    }, 2048);

    // ...
    // Stop getting the raw audio data
    track.setAudioFrameCallback(null);

    Parameters

    • audioFrameCallback: null | ((buffer) => void)

      The callback function for receiving the AudioBuffer object. If you set audioFrameCallback as null, the SDK stops getting raw audio data.

    • Optional frameSize: number

      The number of samples of each audio channel that an AudioBuffer object contains. You can set frameSize as 256, 512, 1024, 2048, 4096, 8192, or 16384. The default value is 4096.

    Returns void
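The relationship between frameSize and the callback interval follows directly from the sample rate: one frame spans frameSize / sampleRate seconds per channel. A sketch assuming a 48 kHz sample rate (an assumption for illustration; the actual rate depends on the audio source):

```typescript
// Approximate interval between setAudioFrameCallback invocations, given the
// number of samples per channel (frameSize) and the sample rate in Hz.
// With the default frameSize of 4096 at 48 kHz, frames arrive roughly
// every 85 ms; at frameSize 256 the interval shrinks to about 5 ms.
function callbackIntervalMs(frameSize: number, sampleRate: number): number {
  return (frameSize / sampleRate) * 1000;
}
```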

  • setEnabled(enabled): Enables/Disables the track.

    Since 4.0.0

    After a track is disabled, the SDK stops playing and publishing the track.

    • Disabling a track does not trigger the [LocalTrack.on("track-ended")]event_track_ended event.
    • If a track is published, disabling this track triggers the [user-unpublished]IAgoraRTCClient.event_user_unpublished event on the remote client, and re-enabling this track triggers the [user-published]IAgoraRTCClient.event_user_published event.
    • Do not call setEnabled and setMuted together.

    Parameters

    • enabled: boolean

      Whether to enable the track:

      • true: Enable the track.
      • false: Disable the track.

    Returns Promise<void>
  • setMuted(muted): Sends or stops sending the media data of the track.

    Since 4.6.0

    If the track is published, a successful call of setMuted(true) triggers the [user-unpublished]IAgoraRTCClient.event_user_unpublished event on the remote client, and a successful call of setMuted(false) triggers the [user-published]IAgoraRTCClient.event_user_published event.

    Parameters

    • muted: boolean

      Whether to stop sending the media data of the track:

      • true: Stop sending the media data of the track.
      • false: Resume sending the media data of the track.

    Returns Promise<void>

  • setPlaybackDevice(deviceId): Sets the playback device (speaker) for the remote audio stream.

    Since 4.1.0

    Parameters

    • deviceId: string

      The device ID, which can be retrieved by calling [[getPlaybackDevices]].

    Returns Promise<void>

  • setVolume(volume): Sets the volume of a local audio track.

    Parameters

    • volume: number

      The volume. The value ranges from 0 (mute) to 1000 (maximum); 100 is the original volume. The volume change may not be noticeable to the human ear. If the local track has been published, setting its volume also affects the volume heard by remote users.

    Returns void

  • startProcessAudioBuffer(options?): Starts processing the audio buffer.

    Once processing starts, the processing unit in the SDK receives the audio data. If the audio track has been published, remote users can hear the audio. Whether the local user can hear the audio depends on whether you call the [[play]] method to send the audio data to the sound card.

    Parameters

    • Optional options: AudioSourceOptions

      Options for processing the audio buffer. See [[AudioSourceOptions]].

    Returns void
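A hedged sketch of passing options to this method. The `loop`, `cycle`, and `startPlayTime` fields are assumed from the SDK's AudioSourceOptions; verify them against your SDK version before relying on them:

```typescript
// Assumed shape of AudioSourceOptions (verify against your SDK version).
interface AudioSourceOptionsLike {
  loop?: boolean;         // loop the file indefinitely
  cycle?: number;         // or play it a fixed number of times
  startPlayTime?: number; // offset (seconds) to start playback from
}

// Play the file twice, starting 10 seconds in.
const options: AudioSourceOptionsLike = { cycle: 2, startPlayTime: 10 };
// track.startProcessAudioBuffer(options); // with a real track instance
```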

  • stop(): Stops playing the media track.

    Returns void

  • stopProcessAudioBuffer(): Stops processing the audio buffer.

    Returns void

  • unpipe(): Removes the Processor inserted to the local audio track.

    Since 4.10.0

    Returns void