Interface IBufferSourceAudioTrack

BufferSourceAudioTrack inherits from LocalAudioTrack. It is an interface for audio sourced from a local audio file and adds several functions for controlling the processing of the audio buffer, such as starting and stopping processing and seeking to a specified time position.

You can create an audio track from an audio file by calling AgoraRTC.createBufferSourceAudioTrack.
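
For example, a minimal sketch of creating a track from an online audio file and playing it locally; the file URL is a placeholder and error handling is omitted:

    const track = await AgoraRTC.createBufferSourceAudioTrack({
      // Replace with your own audio file URL, File object, or AudioBuffer.
      source: "https://example.com/audio.mp3",
    });
    // Start processing the audio buffer, then play the track on the local page.
    track.startProcessAudioBuffer();
    track.play();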

Events

source-state-change

  • Occurs when the state of processing the audio buffer in BufferSourceAudioTrack changes.

    Parameters

    • currentState: AudioSourceState

      The state of processing the audio buffer:

      • "stopped": The SDK stops processing the audio buffer. Reasons may include:
      • The SDK finishes processing the audio buffer.
      • The user manually stops the processing of the audio buffer.
      • "paused": The SDK pauses the processing of the audio buffer.
      • "playing": The SDK is processing the audio buffer.

    Returns void
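
    For reference, a minimal sketch of listening for this event on a BufferSourceAudioTrack named track:

    track.on("source-state-change", (currentState) => {
      // currentState is "playing", "paused", or "stopped".
      console.log("Audio buffer state changed to:", currentState);
    });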

Properties

currentState

currentState: AudioSourceState

The current state of audio buffer processing, such as playing, paused, or stopped.

duration

duration: number

The total duration of the audio (seconds).

enabled

enabled: boolean

isPlaying

isPlaying: boolean

Whether a media track is playing on the webpage:

  • true: The media track is playing on the webpage.
  • false: The media track is not playing on the webpage.

muted

muted: boolean

playbackSpeed

playbackSpeed: number

Since 4.18.0

The playback speed of the current audio file. Valid range is [50, 400], where:

  • 50: Half the original speed.
  • 100: (Default) The original speed.
  • 400: Four times the original speed.

processorDestination

processorDestination: IAudioProcessor

Since 4.10.0

The destination of the current processing pipeline on the local audio track.

source

source: string | File | AudioBuffer | null

The source specified when creating an audio track.

trackMediaType

trackMediaType: "audio" | "video"

The type of a media track:

  • "audio": Audio track.
  • "video": Video track.

Methods

close

  • close(): void
  • Closes a local track and releases the audio and video resources that it occupies.

    Once you close a local track, you can no longer reuse it.

    Returns void

getCurrentTime

  • getCurrentTime(): number
  • Gets the progress (seconds) of the audio buffer processing.

    Returns number

    The progress (seconds) of the audio buffer processing.
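
    For example, a sketch that combines getCurrentTime with the duration property to report playback progress; the polling interval is arbitrary:

    const timer = setInterval(() => {
      const progress = track.getCurrentTime() / track.duration;
      console.log(`Playback progress: ${(progress * 100).toFixed(1)}%`);
    }, 1000);
    // Call clearInterval(timer) once processing stops.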

getListeners

  • getListeners(event: string): Function[]
  • Gets all the listeners for a specified event.

    Parameters

    • event: string

      The event name.

    Returns Function[]

getMediaStreamTrack

  • getMediaStreamTrack(): MediaStreamTrack

getRTCRtpTransceiver

  • getRTCRtpTransceiver(type?: StreamType): RTCRtpTransceiver | undefined

getStats

getTrackId

  • getTrackId(): string

getTrackLabel

  • getTrackLabel(): string
  • Gets the label of a local track.

    Returns string

    The label that the SDK returns may include:

    • The MediaDeviceInfo.label property, if the track is created by calling createMicrophoneAudioTrack or createCameraVideoTrack.
    • The sourceId property, if the track is created by calling createScreenVideoTrack.
    • The MediaStreamTrack.label property, if the track is created by calling createCustomAudioTrack or createCustomVideoTrack.

getVolumeLevel

  • getVolumeLevel(): number
  • Gets the audio level of a local audio track.

    Returns number

    The audio level. The value range is [0,1]. 1 is the highest audio level. Usually a user with audio level above 0.6 is a speaking user.
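
    For example, a sketch that polls the audio level of the track every 500 ms (an arbitrary interval); the 0.6 threshold follows the description above:

    setInterval(() => {
      const level = track.getVolumeLevel();
      if (level > 0.6) {
        console.log("Audio level is above the speaking threshold:", level);
      }
    }, 500);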

off

  • off(event: string, listener: Function): void
  • Removes the listener for a specified event.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback that corresponds to the event listener.

    Returns void

on

  • on(event: "source-state-change", listener: typeof event_source_state_change): void
  • on(event: string, listener: Function): void
  • Parameters

    • event: "source-state-change"

      The event name.

    • listener: typeof event_source_state_change

    Returns void

  • When the specified event happens, the SDK triggers the callback that you pass.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback function.

    Returns void

once

  • once(event: string, listener: Function): void
  • Listens for a specified event once.

    When the specified event happens, the SDK triggers the callback that you pass and then removes the listener.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback to trigger.

    Returns void
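
    For example, a sketch that uses once to handle only the first source-state-change event and removeAllListeners to clean up afterward:

    track.once("source-state-change", (state) => {
      console.log("First state change:", state);
    });
    // Later, remove any remaining listeners for this event.
    track.removeAllListeners("source-state-change");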

pauseProcessAudioBuffer

  • pauseProcessAudioBuffer(): void
  • Pauses processing the audio buffer.

    Returns void

pipe

  • pipe(processor: IAudioProcessor): IAudioProcessor
  • Inserts a Processor into the local audio track.

    Parameters

    • processor: IAudioProcessor

      The Processor instance. Each extension has a corresponding type of Processor.

    Returns IAudioProcessor

    The Processor instance.
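
    For example, a sketch of inserting an extension's Processor into the track's pipeline and then piping it to processorDestination; the extension object and its createProcessor method are placeholders for whichever Agora extension you use:

    // `extension` and createProcessor() stand in for a concrete extension's API.
    const processor = extension.createProcessor();
    track.pipe(processor).pipe(track.processorDestination);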

play

  • play(): void

removeAllListeners

  • removeAllListeners(event?: string): void
  • Removes all listeners for a specified event.

    Parameters

    • Optional event: string

      The event name. If left empty, all listeners for all events are removed.

    Returns void

resumeProcessAudioBuffer

  • resumeProcessAudioBuffer(): void
  • Resumes processing the audio buffer.

    Returns void

seekAudioBuffer

  • seekAudioBuffer(time: number): void
  • Jumps to a specified time point.

    Parameters

    • time: number

      The specified time point (seconds).

    Returns void
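
    For example, a sketch that pauses the processing, jumps back to the start of the file, and resumes:

    track.pauseProcessAudioBuffer();
    // Jump back to the beginning of the audio file (0 seconds).
    track.seekAudioBuffer(0);
    track.resumeProcessAudioBuffer();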

setAudioBufferPlaybackSpeed

  • setAudioBufferPlaybackSpeed(speed: number): void
  • Since 4.18.0

    Sets the playback speed for the current audio file.

    You can call this method before or after joining a channel.

    Parameters

    • speed: number

      The playback speed. Valid range is [50, 400], where:

      • 50: Half the original speed.
      • 100: (Default) The original speed.
      • 400: Four times the original speed.

    Returns void
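
    For example, a sketch that doubles the playback speed and reads the new value back from the playbackSpeed property:

    track.setAudioBufferPlaybackSpeed(200); // Twice the original speed
    console.log("Current playback speed:", track.playbackSpeed);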

setAudioFrameCallback

  • setAudioFrameCallback(audioFrameCallback: null | function, frameSize?: number): void
  • Sets the callback for getting raw audio data in PCM format.

    After you successfully set the callback, the SDK keeps returning the audio frames of the local audio track to this callback as AudioBuffer objects.

    You can set the frameSize parameter to determine the frame size in each callback, which affects the interval between the callbacks. The larger the frame size, the longer the interval between them.

    track.setAudioFrameCallback((buffer) => {
      for (let channel = 0; channel < buffer.numberOfChannels; channel += 1) {
        // Float32Array with PCM data
        const currentChannelData = buffer.getChannelData(channel);
        console.log("PCM data in channel", channel, currentChannelData);
      }
    }, 2048);
    
    // ....
    // Stop getting the raw audio data
    track.setAudioFrameCallback(null);
    

    Parameters

    • audioFrameCallback: null | function

      The callback function for receiving the AudioBuffer object. If you set audioFrameCallback to null, the SDK stops getting raw audio data.

    • Optional frameSize: number

      The number of samples of each audio channel that an AudioBuffer object contains. You can set frameSize as 256, 512, 1024, 2048, 4096, 8192, or 16384. The default value is 4096.

    Returns void

setEnabled

  • setEnabled(enabled: boolean): Promise<void>
  • Since 4.0.0

    Enables/Disables the track.

    After a track is disabled, the SDK stops playing and publishing the track.

    Parameters

    • enabled: boolean

      Whether to enable the track:

      • true: Enable the track.
      • false: Disable the track.

    Returns Promise<void>

setMuted

  • setMuted(muted: boolean): Promise<void>
  • Sends or stops sending the media data of the track.

    Since 4.6.0

    If the track is published, a successful call of setMuted(true) triggers the user-unpublished event on the remote client, and a successful call of setMuted(false) triggers the user-published event.

    Parameters

    • muted: boolean

      Whether to stop sending the media data of the track:

      • true: Stop sending the media data of the track.
      • false: Resume sending the media data of the track.

    Returns Promise<void>
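
    For example, a sketch that temporarily stops sending the audio data of a published track:

    // Remote clients receive the "user-unpublished" event.
    await track.setMuted(true);
    // ...
    // Remote clients receive the "user-published" event.
    await track.setMuted(false);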

setPlaybackDevice

  • setPlaybackDevice(deviceId: string): Promise<void>

setVolume

  • setVolume(volume: number): void
  • Sets the volume of a local audio track.

    Parameters

    • volume: number

      The volume. The value ranges from 0 (mute) to 1000 (maximum); 100 is the original volume. A small change in volume may not be obvious to the human ear. If the local track has been published, setting the volume also affects the volume heard by remote users.

    Returns void
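
    For example, a sketch that halves the volume and then restores the original volume:

    track.setVolume(50);  // Half of the original volume
    track.setVolume(100); // Back to the original volume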

startProcessAudioBuffer

  • startProcessAudioBuffer(options?: AudioSourceOptions): void
  • Starts processing the audio buffer.

    Once the SDK starts processing the audio buffer, the processing unit in the SDK receives the audio data. If the audio track has been published, remote users can hear the audio. Whether the local user can hear the audio depends on whether you call the play method to send the audio data to the sound card.

    Parameters

    • Optional options: AudioSourceOptions

      Options for processing the audio buffer.

    Returns void
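
    For example, a minimal sketch of the flow described above; the app ID and channel name are placeholders, the token is omitted (null), and client creation would normally happen once elsewhere in your app:

    const client = AgoraRTC.createClient({ mode: "rtc", codec: "vp8" });
    await client.join("<APPID>", "<CHANNEL>", null);
    await client.publish(track);

    // Remote users hear the audio once the buffer processing starts.
    track.startProcessAudioBuffer();
    // The local user hears the audio only after play() sends it to the sound card.
    track.play();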

stop

  • stop(): void

stopProcessAudioBuffer

  • stopProcessAudioBuffer(): void
  • Stops processing the audio buffer.

    Returns void

unpipe

  • unpipe(): void
  • Removes the Processor inserted into the local audio track.

    Returns void