OfflineAudioContext

The OfflineAudioContext interface is an AudioContext interface representing an audio-processing graph built from AudioNodes linked together. In contrast with a standard AudioContext, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an AudioBuffer.

MDN Reference

Constructors

constructor(numberOfChannels: Int, length: Int, sampleRate: Float)
constructor(contextOptions: OfflineAudioContextOptions)
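
A construction sketch, assuming the kotlin-browser wrappers: one second of stereo audio at 44.1 kHz, so length = 44100 sample-frames.

// 2 channels, 44100 sample-frames, 44100 Hz: one second of stereo audio
val ctx = OfflineAudioContext(2, 44100, 44100f)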

Properties

val audioWorklet: AudioWorklet

The audioWorklet read-only property of the BaseAudioContext interface returns an instance of AudioWorklet that can be used for adding AudioWorkletProcessor-derived classes which implement custom audio processing. Available only in secure contexts.

val currentTime: Double

The currentTime read-only property of the BaseAudioContext interface returns a double representing an ever-increasing hardware timestamp in seconds that can be used for scheduling audio playback, visualizing timelines, etc.
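
A scheduling sketch, reusing ctx from the constructor example and a hypothetical OscillatorNode osc:

// In a fresh OfflineAudioContext, currentTime is 0.0 and only advances while rendering.
osc.start(ctx.currentTime + 0.5) // begin half a second into the render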

val destination: AudioDestinationNode

The destination property of the BaseAudioContext interface returns an AudioDestinationNode representing the final destination of all audio in the context.

val length: Int

The length property of the OfflineAudioContext interface returns an integer representing the size of the buffer in sample-frames.
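
The length is fixed at construction; the usual arithmetic is sample-frames = seconds × sampleRate, as in this sketch:

val seconds = 3
val ctx = OfflineAudioContext(2, seconds * 44100, 44100f)
check(ctx.length == 132300) // 3 * 44100 sample-frames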

val listener: AudioListener

The listener property of the BaseAudioContext interface returns an AudioListener object that can then be used for implementing 3D audio spatialization.

val sampleRate: Float

The sampleRate property of the BaseAudioContext interface returns a floating point number representing the sample rate, in samples per second, used by all nodes in this audio context.

val state: AudioContextState

The state read-only property of the BaseAudioContext interface returns the current state of the AudioContext.
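
A quick inspection sketch, assuming ctx from the constructor example:

println(ctx.sampleRate) // 44100.0
println(ctx.length)     // 44100
println(ctx.state)      // "suspended" until rendering starts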

Functions

fun createAnalyser(): AnalyserNode

The createAnalyser() method of the BaseAudioContext interface creates an AnalyserNode, which can be used to expose audio time and frequency data and create data visualizations.
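
A sketch of a frequency-domain readout; assumes the wrapper's Float32Array can be constructed from a length:

val analyser = ctx.createAnalyser()
analyser.fftSize = 2048
val bins = Float32Array(analyser.frequencyBinCount) // frequencyBinCount = fftSize / 2
analyser.getFloatFrequencyData(bins)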

fun createBiquadFilter(): BiquadFilterNode

The createBiquadFilter() method of the BaseAudioContext interface creates a BiquadFilterNode, which represents a second order filter configurable as several different common filter types.
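
A low-pass configuration sketch; the BiquadFilterType constant's exact spelling in the wrapper is an assumption:

val filter = ctx.createBiquadFilter()
filter.type = BiquadFilterType.lowpass
filter.frequency.value = 1000f // cutoff in Hz
filter.Q.value = 0.7f          // resonance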

fun createBuffer(numberOfChannels: Int, length: Int, sampleRate: Float): AudioBuffer

The createBuffer() method of the BaseAudioContext Interface is used to create a new, empty AudioBuffer object, which can then be populated by data, and played via an AudioBufferSourceNode.
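
A sketch that fills a one-second mono buffer with white noise, assuming indexed assignment on the wrapper's Float32Array:

import kotlin.random.Random

val buffer = ctx.createBuffer(1, ctx.sampleRate.toInt(), ctx.sampleRate)
val channel = buffer.getChannelData(0)
for (i in 0 until buffer.length) {
    channel[i] = Random.nextFloat() * 2f - 1f // uniform noise in [-1, 1)
}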

fun createBufferSource(): AudioBufferSourceNode

The createBufferSource() method of the BaseAudioContext Interface is used to create a new AudioBufferSourceNode, which can be used to play audio data contained within an AudioBuffer object.
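
Playing the noise buffer from the previous sketch through the context:

val source = ctx.createBufferSource()
source.buffer = buffer
source.connect(ctx.destination)
source.start() // plays from t = 0 when rendering runs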

fun createChannelMerger(numberOfInputs: Int = definedExternally): ChannelMergerNode

The createChannelMerger() method of the BaseAudioContext interface creates a ChannelMergerNode, which combines channels from multiple audio streams into a single audio stream.

fun createChannelSplitter(numberOfOutputs: Int = definedExternally): ChannelSplitterNode

The createChannelSplitter() method of the BaseAudioContext Interface is used to create a ChannelSplitterNode, which is used to access the individual channels of an audio stream and process them separately.
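
A sketch combining a splitter and a merger to swap the left and right channels of a hypothetical stereo node stereoSource:

val splitter = ctx.createChannelSplitter(2)
val merger = ctx.createChannelMerger(2)
stereoSource.connect(splitter)
splitter.connect(merger, 0, 1) // old left  -> new right
splitter.connect(merger, 1, 0) // old right -> new left
merger.connect(ctx.destination)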

fun createConstantSource(): ConstantSourceNode

The createConstantSource() method of the BaseAudioContext interface creates a ConstantSourceNode object, which is an audio source that continuously outputs a monaural (one-channel) sound signal whose samples all have the same value.

fun createConvolver(): ConvolverNode

The createConvolver() method of the BaseAudioContext interface creates a ConvolverNode, which is commonly used to apply reverb effects to your audio.

fun createDelay(maxDelayTime: Double = definedExternally): DelayNode

The createDelay() method of the BaseAudioContext interface is used to create a DelayNode, which is used to delay the incoming audio signal by a certain amount of time.
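
A feedback-echo sketch around a hypothetical input node:

val delay = ctx.createDelay(5.0) // allow delays of up to 5 s
delay.delayTime.value = 0.25f    // 250 ms between echoes
val feedback = ctx.createGain()
feedback.gain.value = 0.4f       // each echo at 40% of the previous
input.connect(delay)
delay.connect(feedback)
feedback.connect(delay)          // feedback loop
delay.connect(ctx.destination)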

fun createDynamicsCompressor(): DynamicsCompressorNode

The createDynamicsCompressor() method of the BaseAudioContext Interface is used to create a DynamicsCompressorNode, which can be used to apply compression to an audio signal.

fun createGain(): GainNode

The createGain() method of the BaseAudioContext interface creates a GainNode, which can be used to control the overall gain (or volume) of the audio graph.

fun createIIRFilter(feedforward: ReadonlyArray<JsDouble>, feedback: ReadonlyArray<JsDouble>): IIRFilterNode

The createIIRFilter() method of the BaseAudioContext interface creates an IIRFilterNode, which represents a general infinite impulse response (IIR) filter which can be configured to serve as various types of filter.

fun createOscillator(): OscillatorNode

The createOscillator() method of the BaseAudioContext interface creates an OscillatorNode, a source representing a periodic waveform.
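
A one-second 440 Hz tone sketch; the OscillatorType constant's exact spelling in the wrapper is an assumption:

val osc = ctx.createOscillator()
osc.type = OscillatorType.sine
osc.frequency.value = 440f
osc.connect(ctx.destination)
osc.start()
osc.stop(1.0) // stop one second into the render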

fun createPanner(): PannerNode

The createPanner() method of the BaseAudioContext Interface is used to create a new PannerNode, which is used to spatialize an incoming audio stream in 3D space.

fun createPeriodicWave(real: ReadonlyArray<JsDouble>, imag: ReadonlyArray<JsDouble>, constraints: PeriodicWaveConstraints = definedExternally): PeriodicWave
fun createPeriodicWave(real: Float32Array<*>, imag: Float32Array<*>, constraints: PeriodicWaveConstraints = definedExternally): PeriodicWave

The createPeriodicWave() method of the BaseAudioContext interface is used to create a PeriodicWave, which defines a periodic waveform that can be used to shape the output of an OscillatorNode.
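
A custom-waveform sketch using the Float32Array overload (again assuming indexed assignment): a fundamental plus a quieter third harmonic.

val real = Float32Array(4) // cosine terms, left at zero
val imag = Float32Array(4) // sine terms
imag[1] = 1f   // fundamental
imag[3] = 0.3f // third harmonic
val wave = ctx.createPeriodicWave(real, imag)
val osc = ctx.createOscillator()
osc.setPeriodicWave(wave)
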
fun createStereoPanner(): StereoPannerNode

The createStereoPanner() method of the BaseAudioContext interface creates a StereoPannerNode, which can be used to apply stereo panning to an audio source.

fun createWaveShaper(): WaveShaperNode

The createWaveShaper() method of the BaseAudioContext interface creates a WaveShaperNode, which represents a non-linear distortion.

suspend fun decodeAudioData(audioData: ArrayBuffer, successCallback: DecodeSuccessCallback? = definedExternally, errorCallback: DecodeErrorCallback? = definedExternally): AudioBuffer

The decodeAudioData() method of the BaseAudioContext Interface is used to asynchronously decode audio file data contained in an ArrayBuffer. The decoded audio is resampled to the context's sample rate, then passed to a callback or promise.

fun decodeAudioDataAsync(audioData: ArrayBuffer, successCallback: DecodeSuccessCallback? = definedExternally, errorCallback: DecodeErrorCallback? = definedExternally): Promise<AudioBuffer>
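
A decoding sketch using the suspending variant; obtaining the encoded bytes is out of scope here:

suspend fun decode(ctx: OfflineAudioContext, bytes: ArrayBuffer): AudioBuffer =
    ctx.decodeAudioData(bytes)
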
suspend fun resume()

The resume() method of the OfflineAudioContext interface resumes the progression of time in an audio context that has been suspended.

suspend fun startRendering(): AudioBuffer

The startRendering() method of the OfflineAudioContext Interface starts rendering the audio graph, taking into account the current connections and the current scheduled changes.
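
An end-to-end sketch, assuming the suspending startRendering() returns the rendered AudioBuffer (mirroring the promise-based JS API):

suspend fun renderTone(): AudioBuffer {
    val ctx = OfflineAudioContext(2, 44100, 44100f)
    val osc = ctx.createOscillator()
    osc.frequency.value = 440f
    osc.connect(ctx.destination)
    osc.start()
    return ctx.startRendering() // completes once the whole graph has been rendered
}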

suspend fun suspend(suspendTime: Double)

The suspend() method of the OfflineAudioContext interface schedules a suspension of the time progression in the audio context at the specified time and returns a promise.
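
A mid-render edit sketch: pause time at t = 0.5 s, change the graph, then resume. Launching the suspension handler before startRendering() mirrors the JS pattern; names are illustrative.

import kotlinx.coroutines.coroutineScope
import kotlinx.coroutines.launch

suspend fun renderWithMidpointChange(ctx: OfflineAudioContext, osc: OscillatorNode): AudioBuffer =
    coroutineScope {
        launch {
            ctx.suspend(0.5)           // resolves once rendering pauses at t = 0.5 s
            osc.frequency.value = 880f // edit the graph while time is frozen
            ctx.resume()               // let rendering continue
        }
        ctx.startRendering()
    }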

fun suspendAsync(suspendTime: Double): Promise<Void>