Package-level declarations

Types

external class AnalyserNode(context: BaseAudioContext, options: AnalyserOptions = definedExternally) : AudioNode

A node able to provide real-time frequency and time-domain analysis information. It is an AudioNode that passes the audio stream unchanged from the input to the output, but allows you to take the generated data, process it, and create audio visualizations.
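
For illustration, a minimal sketch of inserting an AnalyserNode between a source and the destination and reading one snapshot of frequency data; the web.audio and js.typedarrays import paths are assumptions about how these declarations are packaged, not part of this listing.

```kotlin
import web.audio.AnalyserNode
import web.audio.AudioContext
import web.audio.AudioNode
import js.typedarrays.Uint8Array

// Insert an analyser between an existing source node and the speakers,
// then read one snapshot of the current frequency-domain data.
fun attachAnalyser(context: AudioContext, source: AudioNode): AnalyserNode {
    val analyser = AnalyserNode(context)               // default fftSize of 2048
    source.connect(analyser)
    analyser.connect(context.destination)

    val bins = Uint8Array(analyser.frequencyBinCount)  // one byte per frequency bin
    analyser.getByteFrequencyData(bins)                // call once per animation frame for visualizations
    return analyser
}
```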

external interface AnalyserOptions : AudioNodeOptions
external class AudioBuffer(options: AudioBufferOptions)

A short audio asset residing in memory, created from an audio file using the AudioContext.decodeAudioData() method, or from raw data using AudioContext.createBuffer(). Once put into an AudioBuffer, the audio can then be played by being passed into an AudioBufferSourceNode.

external interface AudioBufferOptions
external class AudioBufferSourceNode(context: BaseAudioContext, options: AudioBufferSourceOptions = definedExternally) : AudioScheduledSourceNode

An AudioScheduledSourceNode which represents an audio source consisting of in-memory audio data, stored in an AudioBuffer. It's especially useful for playing back audio which has particularly stringent timing accuracy requirements, such as for sounds that must match a specific rhythm and can be kept in memory rather than being played from disk or the network.
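
As a hedged sketch (the web.audio package path is assumed), playing an already-decoded AudioBuffer at a precise time with an AudioBufferSourceNode:

```kotlin
import web.audio.AudioBuffer
import web.audio.AudioBufferSourceNode
import web.audio.AudioContext

// Schedule an in-memory clip to start exactly half a second from now.
// An AudioBufferSourceNode is single-use: create a new node for each playback.
fun playClip(context: AudioContext, clip: AudioBuffer) {
    val source = AudioBufferSourceNode(context)
    source.buffer = clip
    source.connect(context.destination)
    source.start(context.currentTime + 0.5)  // sample-accurate start time in seconds
}
```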

external interface AudioBufferSourceOptions
open external class AudioContext(contextOptions: AudioContextOptions = definedExternally) : BaseAudioContext

An audio-processing graph built from audio modules linked together, each represented by an AudioNode.
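
A minimal sketch of building such a graph, assuming the kotlin-wrappers web.audio package and Double-typed parameter values; it plays a quiet 440 Hz tone for one second:

```kotlin
import web.audio.AudioContext
import web.audio.GainNode
import web.audio.OscillatorNode

// Build oscillator -> gain -> destination and play a short tone.
// Browsers typically require a user gesture before an AudioContext may start.
fun playTone() {
    val context = AudioContext()
    val oscillator = OscillatorNode(context)
    val gain = GainNode(context)

    oscillator.frequency.value = 440.0  // Hz (literal type assumed)
    gain.gain.value = 0.1               // keep the volume low

    oscillator.connect(gain)
    gain.connect(context.destination)

    oscillator.start()
    oscillator.stop(context.currentTime + 1.0)
}
```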

sealed external interface AudioContextLatencyCategory
external interface AudioContextOptions
sealed external interface AudioContextState
sealed external class AudioDestinationNode : AudioNode

AudioDestinationNode has no output (as it is the output, no more AudioNode can be linked after it in the audio graph) and one input. The number of channels in the input must be between 0 and the maxChannelCount value or an exception is raised.

sealed external class AudioListener

Represents the position and orientation of the unique person listening to the audio scene, and is used in audio spatialization. All PannerNodes spatialize in relation to the AudioListener stored in the BaseAudioContext.listener attribute.

sealed external class AudioNode : EventTarget

A generic interface for representing an audio processing module. Examples include an audio source (such as an OscillatorNode or AudioBufferSourceNode), the audio destination (AudioDestinationNode), an intermediate processing module (such as a BiquadFilterNode or ConvolverNode), and volume control (GainNode).

external interface AudioNodeOptions
sealed external class AudioParam

The Web Audio API's AudioParam interface represents an audio-related parameter, usually a parameter of an AudioNode (such as GainNode.gain).
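
For example, the AudioParam scheduling methods can automate a GainNode's gain over time; a sketch, with numeric literal types assumed:

```kotlin
import web.audio.AudioContext
import web.audio.GainNode

// Fade a gain parameter in over two seconds using the param's automation timeline.
fun fadeIn(context: AudioContext, gain: GainNode) {
    val now = context.currentTime
    gain.gain.cancelScheduledValues(now)               // clear any pending automation
    gain.gain.setValueAtTime(0.0, now)                 // start silent
    gain.gain.linearRampToValueAtTime(1.0, now + 2.0)  // ramp to full volume
}
```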

external interface AudioTimestamp
sealed external class AudioWorklet : Worklet

Available only in secure contexts.

open external class AudioWorkletNode(context: BaseAudioContext, name: String, options: AudioWorkletNodeOptions = definedExternally) : AudioNode

Available only in secure contexts.

abstract external class AudioWorkletProcessor
sealed external interface AutomationRate
external class BiquadFilterNode(context: BaseAudioContext, options: BiquadFilterOptions = definedExternally) : AudioNode

A simple low-order filter, created using the AudioContext.createBiquadFilter() method. It is an AudioNode that can represent different kinds of filters, tone control devices, and graphic equalizers.
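
A hedged sketch of inserting a low-pass BiquadFilterNode into a chain; the web.audio package path, the numeric literal types, and the BiquadFilterType companion constant are assumptions about these bindings.

```kotlin
import web.audio.AudioContext
import web.audio.AudioNode
import web.audio.BiquadFilterNode
import web.audio.BiquadFilterType

// Route a source through a resonant low-pass filter before it reaches the speakers.
fun lowPass(context: AudioContext, source: AudioNode): BiquadFilterNode {
    val filter = BiquadFilterNode(context)
    filter.type = BiquadFilterType.lowpass  // assumed constant on the BiquadFilterType companion
    filter.frequency.value = 800.0          // cutoff in Hz
    filter.Q.value = 1.5                    // resonance around the cutoff

    source.connect(filter)
    filter.connect(context.destination)
    return filter
}
```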

sealed external interface BiquadFilterType
sealed external interface ChannelCountMode
sealed external interface ChannelInterpretation
external class ChannelMergerNode(context: BaseAudioContext, options: ChannelMergerOptions = definedExternally) : AudioNode

The ChannelMergerNode interface, often used in conjunction with its opposite, ChannelSplitterNode, reunites different mono inputs into a single output. Each input is used to fill a channel of the output. This is useful for accessing each channel separately, e.g. for performing channel mixing where gain must be separately controlled on each channel.

external class ChannelSplitterNode(context: BaseAudioContext, options: ChannelSplitterOptions = definedExternally) : AudioNode

The ChannelSplitterNode interface, often used in conjunction with its opposite, ChannelMergerNode, separates the different channels of an audio source into a set of mono outputs. This is useful for accessing each channel separately, e.g. for performing channel mixing where gain must be separately controlled on each channel.
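
Together the two nodes allow per-channel processing; a sketch of applying independent gain to the left and right channels of a stereo source (package path and literal types assumed):

```kotlin
import web.audio.AudioContext
import web.audio.AudioNode
import web.audio.ChannelMergerNode
import web.audio.ChannelSplitterNode
import web.audio.GainNode

// Split a stereo source, scale each channel separately, then merge back to stereo.
fun perChannelGain(context: AudioContext, source: AudioNode) {
    val splitter = ChannelSplitterNode(context)  // default: 6 mono outputs
    val merger = ChannelMergerNode(context)      // default: 6 mono inputs
    val left = GainNode(context)
    val right = GainNode(context)

    left.gain.value = 1.0
    right.gain.value = 0.5          // attenuate only the right channel

    source.connect(splitter)
    splitter.connect(left, 0)       // output 0 = left channel
    splitter.connect(right, 1)      // output 1 = right channel
    left.connect(merger, 0, 0)      // into merger input 0
    right.connect(merger, 0, 1)     // into merger input 1
    merger.connect(context.destination)
}
```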

external interface ConstantSourceOptions
external class ConvolverNode(context: BaseAudioContext, options: ConvolverOptions = definedExternally) : AudioNode

An AudioNode that performs a Linear Convolution on a given AudioBuffer, often used to achieve a reverb effect. A ConvolverNode always has exactly one input and one output.
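
A sketch of using it for reverb, given an impulse-response AudioBuffer obtained elsewhere (the web.audio package path is assumed):

```kotlin
import web.audio.AudioBuffer
import web.audio.AudioContext
import web.audio.AudioNode
import web.audio.ConvolverNode

// Convolve the source with a pre-loaded impulse response to add reverb.
fun addReverb(context: AudioContext, source: AudioNode, impulseResponse: AudioBuffer) {
    val convolver = ConvolverNode(context)
    convolver.buffer = impulseResponse  // the impulse response defines the reverberant space
    source.connect(convolver)
    convolver.connect(context.destination)
}
```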

external interface ConvolverOptions : AudioNodeOptions
typealias DecodeErrorCallback = (error: DOMException) -> Unit
typealias DecodeSuccessCallback = (decodedData: AudioBuffer) -> Unit
external class DelayNode(context: BaseAudioContext, options: DelayOptions = definedExternally) : AudioNode

A delay-line; an AudioNode audio-processing module that causes a delay between the arrival of the input data and its propagation to the output.
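
For example, a DelayNode combined with a feedback GainNode gives a simple echo; a sketch, with package path and literal types assumed:

```kotlin
import web.audio.AudioContext
import web.audio.AudioNode
import web.audio.DelayNode
import web.audio.GainNode

// Classic feedback echo: each repeat is delayed by 300 ms and attenuated by the feedback gain.
fun echo(context: AudioContext, source: AudioNode) {
    val delay = DelayNode(context)
    val feedback = GainNode(context)

    delay.delayTime.value = 0.3          // seconds
    feedback.gain.value = 0.4            // < 1.0 so the echoes die out

    source.connect(context.destination)  // dry signal
    source.connect(delay)
    delay.connect(feedback)
    feedback.connect(delay)              // feedback loop
    delay.connect(context.destination)   // wet signal
}
```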

external interface DelayOptions : AudioNodeOptions
sealed external interface DistanceModelType
external class DynamicsCompressorNode(context: BaseAudioContext, options: DynamicsCompressorOptions = definedExternally) : AudioNode

Provides a compression effect, which lowers the volume of the loudest parts of the signal in order to help prevent clipping and distortion that can occur when multiple sounds are played and multiplexed together at once.
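
A sketch of placing a compressor just before the destination to tame peaks (package path and literal types assumed):

```kotlin
import web.audio.AudioContext
import web.audio.AudioNode
import web.audio.DynamicsCompressorNode

// Compress anything above -24 dB with a 12:1 ratio before it reaches the speakers.
fun limitOutput(context: AudioContext, source: AudioNode) {
    val compressor = DynamicsCompressorNode(context)
    compressor.threshold.value = -24.0  // dB at which compression starts
    compressor.ratio.value = 12.0       // input/output dB ratio above the threshold
    compressor.attack.value = 0.003     // seconds to apply compression
    compressor.release.value = 0.25     // seconds to release it

    source.connect(compressor)
    compressor.connect(context.destination)
}
```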

external class GainNode(context: BaseAudioContext, options: GainOptions = definedExternally) : AudioNode

A change in volume. It is an AudioNode audio-processing module that causes a given gain to be applied to the input data before its propagation to the output. A GainNode always has exactly one input and one output, both with the same number of channels.

external interface GainOptions : AudioNodeOptions
external class IIRFilterNode(context: BaseAudioContext, options: IIRFilterOptions) : AudioNode

The IIRFilterNode interface of the Web Audio API is an AudioNode processor which implements a general infinite impulse response (IIR) filter; this type of filter can be used to implement tone control devices and graphic equalizers as well. It lets the parameters of the filter response be specified, so that it can be tuned as needed.
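
A sketch of creating one from raw coefficients; here the standard BaseAudioContext.createIIRFilter() factory (not listed on this page) is assumed to accept plain arrays of Double in these bindings.

```kotlin
import web.audio.AudioContext
import web.audio.AudioNode

// A first-order low-pass IIR filter expressed directly as feedforward/feedback coefficients.
fun iirLowPass(context: AudioContext, source: AudioNode) {
    val feedforward = arrayOf(0.1, 0.1)  // b coefficients
    val feedback = arrayOf(1.0, -0.8)    // a coefficients (a0 first)
    val filter = context.createIIRFilter(feedforward, feedback)

    source.connect(filter)
    filter.connect(context.destination)
}
```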

external interface IIRFilterOptions : AudioNodeOptions
external class MediaElementAudioSourceNode(context: AudioContext, options: MediaElementAudioSourceOptions) : AudioNode

A MediaElementAudioSourceNode has no inputs and exactly one output, and is created using the AudioContext.createMediaElementSource() method. The number of channels in the output equals the number of channels of the audio referenced by the HTMLMediaElement used in the creation of the node, or is 1 if the HTMLMediaElement has no audio.

external class MediaStreamAudioDestinationNode(context: AudioContext, options: AudioNodeOptions = definedExternally) : AudioNode
external class MediaStreamAudioSourceNode(context: AudioContext, options: MediaStreamAudioSourceOptions) : AudioNode

A type of AudioNode which operates as an audio source whose media is received from a MediaStream obtained using the WebRTC or Media Capture and Streams APIs.


The Web Audio API OfflineAudioCompletionEvent interface represents events that occur when the processing of an OfflineAudioContext is terminated. The complete event implements this interface.

open external class OfflineAudioContext(contextOptions: OfflineAudioContextOptions) : BaseAudioContext

An AudioContext interface representing an audio-processing graph built from AudioNodes linked together. In contrast with a standard AudioContext, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an AudioBuffer.

external interface OfflineAudioContextOptions
external class OscillatorNode(context: BaseAudioContext, options: OscillatorOptions = definedExternally) : AudioScheduledSourceNode

The OscillatorNode interface represents a periodic waveform, such as a sine wave. It is an AudioScheduledSourceNode audio-processing module that causes a specified frequency of a given wave to be created—in effect, a constant tone.

external interface OscillatorOptions : AudioNodeOptions
sealed external interface OscillatorType
sealed external interface OverSampleType
external class PannerNode(context: BaseAudioContext, options: PannerOptions = definedExternally) : AudioNode

A PannerNode always has exactly one input and one output: the input can be mono or stereo but the output is always stereo (2 channels); you can't have panning effects without at least two audio channels!
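
A sketch of spatializing a source relative to the listener (package path and literal types assumed; positions use the listener's right-handed coordinate space):

```kotlin
import web.audio.AudioContext
import web.audio.AudioNode
import web.audio.PannerNode

// Place a source two metres to the listener's right and slightly in front.
fun spatialize(context: AudioContext, source: AudioNode) {
    val panner = PannerNode(context)
    panner.positionX.value = 2.0
    panner.positionY.value = 0.0
    panner.positionZ.value = -1.0

    // The listener itself sits on the context (BaseAudioContext.listener).
    context.listener.positionX.value = 0.0

    source.connect(panner)
    panner.connect(context.destination)
}
```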

external interface PannerOptions : AudioNodeOptions
sealed external interface PanningModelType
external class PeriodicWave(context: BaseAudioContext, options: PeriodicWaveOptions = definedExternally)

PeriodicWave has no inputs or outputs; it is used to define custom oscillators when calling OscillatorNode.setPeriodicWave(). The PeriodicWave itself is created/returned by AudioContext.createPeriodicWave().
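
A sketch of defining a custom waveform and assigning it to an oscillator; createPeriodicWave() is assumed to accept plain arrays of Fourier coefficients in these bindings.

```kotlin
import web.audio.AudioContext
import web.audio.OscillatorNode

// An oscillator whose tone contains only the second harmonic (sine coefficients in `imag`).
fun playCustomTone(context: AudioContext) {
    val real = arrayOf(0f, 0f, 0f)  // cosine terms: DC, fundamental, 2nd harmonic
    val imag = arrayOf(0f, 0f, 1f)  // sine terms
    val wave = context.createPeriodicWave(real, imag)

    val oscillator = OscillatorNode(context)
    oscillator.setPeriodicWave(wave)
    oscillator.connect(context.destination)
    oscillator.start()
}
```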

external interface PeriodicWaveConstraints
external class StereoPannerNode(context: BaseAudioContext, options: StereoPannerOptions = definedExternally) : AudioNode

The pan property takes a unitless value between -1 (full left pan) and 1 (full right pan). This interface was introduced as a much simpler way to apply a simple panning effect than having to use a full PannerNode.
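
A sketch (package path and literal type assumed) of panning a source most of the way to the left:

```kotlin
import web.audio.AudioContext
import web.audio.AudioNode
import web.audio.StereoPannerNode

// Pan a source 80% to the left without the full 3D machinery of PannerNode.
fun panLeft(context: AudioContext, source: AudioNode) {
    val panner = StereoPannerNode(context)
    panner.pan.value = -0.8  // -1 = full left, 1 = full right
    source.connect(panner)
    panner.connect(context.destination)
}
```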

external class WaveShaperNode(context: BaseAudioContext, options: WaveShaperOptions = definedExternally) : AudioNode

A WaveShaperNode always has exactly one input and one output.

external interface WaveShaperOptions : AudioNodeOptions