Package-level declarations
Types
The AnalyserNode interface represents a node able to provide real-time frequency and time-domain analysis information.
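A rough sketch of typical usage: tap an AnalyserNode into a graph and poll it each animation frame (the oscillator stands in for any real source; all variable names are illustrative).

```typescript
// Sketch: poll frequency-domain data from an AnalyserNode each frame.
const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048; // yields 1024 frequency bins

const osc = audioCtx.createOscillator(); // stand-in source for illustration
osc.connect(analyser).connect(audioCtx.destination);
osc.start();

const bins = new Uint8Array(analyser.frequencyBinCount);
function draw(): void {
  analyser.getByteFrequencyData(bins); // magnitudes scaled to 0-255
  // ...render `bins` to a canvas, level meter, etc.
  requestAnimationFrame(draw);
}
draw();
```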
The AudioBuffer interface represents a short audio asset residing in memory, created from an audio file using the BaseAudioContext.decodeAudioData() method, or from raw data using BaseAudioContext.createBuffer().
The AudioBufferSourceNode interface is an AudioScheduledSourceNode which represents an audio source consisting of in-memory audio data, stored in an AudioBuffer.
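A minimal sketch combining the two interfaces above; the URL is a stand-in, not part of the API.

```typescript
// Sketch: decode a file into an AudioBuffer and play it once.
async function playClip(ctx: AudioContext, url: string): Promise<void> {
  const response = await fetch(url);
  const buffer = await ctx.decodeAudioData(await response.arrayBuffer());
  const source = ctx.createBufferSource(); // AudioBufferSourceNode
  source.buffer = buffer;                  // attach the in-memory asset
  source.connect(ctx.destination);
  source.start();                          // one-shot; create a new node per play
}

playClip(new AudioContext(), "clip.ogg"); // "clip.ogg" is hypothetical
```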
The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode.
The AudioDestinationNode interface represents the end destination of an audio graph in a given context — usually the speakers of your device.
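The smallest useful graph runs a source through the context into its destination. A sketch, assuming playback is triggered by a user gesture (autoplay policies usually require one):

```typescript
// Sketch: a minimal graph, source to gain to destination (speakers).
const ctx = new AudioContext();
const gain = ctx.createGain();
gain.gain.value = 0.5;            // halve the volume
gain.connect(ctx.destination);    // AudioDestinationNode

document.querySelector("button")?.addEventListener("click", async () => {
  await ctx.resume();             // contexts often start suspended
  const osc = ctx.createOscillator();
  osc.connect(gain);
  osc.start();
});
```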
The AudioListener interface represents the position and orientation of the unique person listening to the audio scene, and is used in audio spatialization.
The AudioNode interface is a generic interface for representing an audio processing module.
The Web Audio API's AudioParam interface represents an audio-related parameter, usually a parameter of an AudioNode (such as GainNode.gain).
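AudioParam values can be automated on the context's timeline rather than set directly. A sketch of a two-second fade-out scheduled on a GainNode's gain parameter:

```typescript
// Sketch: schedule a fade-out on an AudioParam (GainNode.gain).
const ctx = new AudioContext();
const gain = ctx.createGain();
gain.connect(ctx.destination);

const now = ctx.currentTime;
gain.gain.setValueAtTime(1.0, now);              // anchor the starting value
gain.gain.linearRampToValueAtTime(0.0, now + 2); // ramp to silence over 2 s
```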
The AudioParamMap interface of the Web Audio API represents an iterable and read-only set of multiple audio parameters.
The AudioScheduledSourceNode interface, part of the Web Audio API, is a parent interface for several types of audio source node interfaces which share the ability to be started and stopped, optionally at specified times.
The AudioWorklet interface of the Web Audio API is used to supply custom audio processing scripts that execute in a separate thread to provide very low-latency audio processing. Available only in secure contexts.
The AudioWorkletGlobalScope interface of the Web Audio API represents a global execution context for user-supplied code, which defines custom AudioWorkletProcessor-derived classes.
The AudioWorkletNode interface of the Web Audio API represents a base class for a user-defined AudioNode, which can be connected to an audio routing graph along with other nodes. Available only in secure contexts.
The AudioWorkletProcessor interface of the Web Audio API represents the audio processing code behind a custom AudioWorkletNode.
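The worklet interfaces above cooperate across two files. A sketch, where the module path and the processor name "noise-processor" are illustrative; in TypeScript the worklet globals (AudioWorkletProcessor, registerProcessor) need ambient declarations, omitted here.

```typescript
// noise-processor.js: runs inside an AudioWorkletGlobalScope, where
// AudioWorkletProcessor and registerProcessor() are globals.
class NoiseProcessor extends AudioWorkletProcessor {
  process(_inputs: Float32Array[][], outputs: Float32Array[][]): boolean {
    for (const channel of outputs[0]) {
      for (let i = 0; i < channel.length; i++) {
        channel[i] = Math.random() * 2 - 1; // white noise
      }
    }
    return true; // keep the processor alive
  }
}
registerProcessor("noise-processor", NoiseProcessor);

// main.js: load the module, then instantiate a node that uses it.
const ctx = new AudioContext();
await ctx.audioWorklet.addModule("noise-processor.js");
const noise = new AudioWorkletNode(ctx, "noise-processor");
noise.connect(ctx.destination);
```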
The BaseAudioContext interface of the Web Audio API acts as a base definition for online and offline audio-processing graphs, as represented by AudioContext and OfflineAudioContext respectively.
The BiquadFilterNode interface represents a simple low-order filter, and is created using the BaseAudioContext.createBiquadFilter() method.
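A sketch of inserting a low-pass biquad between a source and the destination; the cutoff and Q values are arbitrary:

```typescript
// Sketch: low-pass filter a source with a BiquadFilterNode.
const ctx = new AudioContext();
const filter = ctx.createBiquadFilter();
filter.type = "lowpass";
filter.frequency.value = 1000; // cutoff in Hz
filter.Q.value = 1;            // resonance

const osc = ctx.createOscillator();
osc.connect(filter).connect(ctx.destination);
osc.start();
```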
The ChannelMergerNode interface, often used in conjunction with its opposite, ChannelSplitterNode, reunites different mono inputs into a single output.
The ChannelSplitterNode interface, often used in conjunction with its opposite, ChannelMergerNode, separates the different channels of an audio source into a set of mono outputs.
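The two channel nodes are typically used as a pair: split a stereo signal, process each channel independently, and merge back. A sketch with per-channel gains (values are arbitrary):

```typescript
// Sketch: split stereo, attenuate the right channel, merge back.
const ctx = new AudioContext();
const splitter = ctx.createChannelSplitter(2);
const merger = ctx.createChannelMerger(2);
const left = ctx.createGain();
const right = ctx.createGain();
right.gain.value = 0.3; // duck the right channel only

// connect(destination, outputIndex, inputIndex)
splitter.connect(left, 0);
splitter.connect(right, 1);
left.connect(merger, 0, 0);
right.connect(merger, 0, 1);
merger.connect(ctx.destination);
// ...some stereo source would connect into `splitter`.
```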
The ConstantSourceNode interface, part of the Web Audio API, represents an audio source (based upon AudioScheduledSourceNode) whose output is a single, unchanging value.
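Because its output is constant, the node is handy for driving several AudioParams from one value. A sketch grouping two gains under a single control (names and values are illustrative):

```typescript
// Sketch: one ConstantSourceNode driving two gain parameters in lockstep.
const ctx = new AudioContext();
const gainA = ctx.createGain();
const gainB = ctx.createGain();
gainA.gain.value = 0; // the constant source supplies the actual value
gainB.gain.value = 0;

const control = new ConstantSourceNode(ctx, { offset: 0.8 });
control.connect(gainA.gain); // AudioNode-to-AudioParam connections
control.connect(gainB.gain);
control.start();
control.offset.value = 0.5;  // changes both gains at once
```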
The ConvolverNode interface is an AudioNode that performs a linear convolution on a given AudioBuffer, often used to achieve a reverb effect.
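A sketch of a convolution reverb; the impulse-response URL is a stand-in for a real recording:

```typescript
// Sketch: convolution reverb from a recorded impulse response.
async function makeReverb(ctx: AudioContext, irUrl: string): Promise<ConvolverNode> {
  const response = await fetch(irUrl);
  const impulse = await ctx.decodeAudioData(await response.arrayBuffer());
  const convolver = ctx.createConvolver();
  convolver.buffer = impulse; // the room/hall recording to convolve with
  return convolver;
}

const ctx = new AudioContext();
const reverb = await makeReverb(ctx, "hall-ir.wav"); // hypothetical URL
reverb.connect(ctx.destination);
// ...sources connect into `reverb` (often alongside a dry path).
```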
The DelayNode interface represents a delay line: an AudioNode audio-processing module that causes a delay between the arrival of input data and its propagation to the output.
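A DelayNode placed in a feedback loop (delay into an attenuating gain, back into the delay) yields a decaying echo. A sketch with arbitrary timing values:

```typescript
// Sketch: a decaying echo using a DelayNode in a feedback loop.
const ctx = new AudioContext();
const delay = ctx.createDelay(5.0);   // max delay time in seconds
delay.delayTime.value = 0.3;          // 300 ms between repeats

const feedback = ctx.createGain();
feedback.gain.value = 0.4;            // < 1 so each repeat gets quieter

delay.connect(feedback);
feedback.connect(delay);              // cycles are legal through a DelayNode
delay.connect(ctx.destination);
// ...a source would connect into `delay` (and usually also to the destination).
```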
The DynamicsCompressorNode interface provides a compression effect, which lowers the volume of the loudest parts of the signal in order to help prevent clipping and distortion that can occur when multiple sounds are played and multiplexed together at once.
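A sketch placing a compressor just before the destination; the parameter values are illustrative starting points, not recommendations:

```typescript
// Sketch: tame peaks with a DynamicsCompressorNode before output.
const ctx = new AudioContext();
const comp = ctx.createDynamicsCompressor();
comp.threshold.value = -24; // dB above which compression kicks in
comp.knee.value = 30;       // dB range of the soft knee
comp.ratio.value = 12;      // input-dB change per output-dB change
comp.attack.value = 0.003;  // seconds to start compressing
comp.release.value = 0.25;  // seconds to stop compressing
comp.connect(ctx.destination);
// ...sources connect into `comp` instead of directly to the destination.
```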
The GainNode interface represents a change in volume.
The IIRFilterNode interface of the Web Audio API is an AudioNode processor which implements a general infinite impulse response (IIR) filter; this type of filter can be used to implement tone-control devices and graphic equalizers.
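Unlike BiquadFilterNode, the coefficients are supplied directly. A sketch of a one-pole low-pass; the coefficient values are purely illustrative:

```typescript
// Sketch: one-pole low-pass via createIIRFilter(feedforward, feedback).
// Difference equation: y[n] = 0.1 * x[n] + 0.9 * y[n-1]
const ctx = new AudioContext();
const iir = ctx.createIIRFilter(
  [0.1],     // feedforward (b) coefficients
  [1, -0.9], // feedback (a) coefficients, a0 = 1
);
iir.connect(ctx.destination);
// ...a source would connect into `iir`; coefficients are fixed after creation.
```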
The MediaElementAudioSourceNode interface represents an audio source consisting of an HTML audio or video element.
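A sketch routing an existing audio element through the graph; the selector is hypothetical:

```typescript
// Sketch: pipe an <audio> element into the audio graph.
const ctx = new AudioContext();
const el = document.querySelector<HTMLAudioElement>("audio#player")!; // hypothetical element
const source = ctx.createMediaElementSource(el);
source.connect(ctx.destination); // the element's output now flows through the graph
el.play();
```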
The MediaStreamAudioDestinationNode interface represents an audio destination consisting of a WebRTC MediaStream with a single audio MediaStreamTrack, which can be used in a similar way to a MediaStream obtained from MediaDevices.getUserMedia().
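One common use is recording graph output. A sketch feeding the node's stream into a MediaRecorder (the oscillator stands in for any real source):

```typescript
// Sketch: capture graph output as a MediaStream and record it.
const ctx = new AudioContext();
const streamDest = ctx.createMediaStreamDestination();

const osc = ctx.createOscillator(); // stand-in source
osc.connect(streamDest);
osc.start();

const recorder = new MediaRecorder(streamDest.stream);
recorder.ondataavailable = (e) => { /* collect e.data blobs */ };
recorder.start();
```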
The MediaStreamAudioSourceNode interface is a type of AudioNode which operates as an audio source whose media is received from a MediaStream obtained using the WebRTC or Media Capture and Streams APIs.
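A sketch bringing microphone input into the graph via getUserMedia; this requires user permission and a secure context:

```typescript
// Sketch: route microphone input through the audio graph.
async function monitorMic(): Promise<void> {
  const ctx = new AudioContext();
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const mic = ctx.createMediaStreamSource(stream);
  mic.connect(ctx.destination); // beware feedback on open speakers
}
```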
The Web Audio API OfflineAudioCompletionEvent interface represents events that occur when the processing of an OfflineAudioContext is terminated.
The OfflineAudioContext interface is an AudioContext interface representing an audio-processing graph built from AudioNodes linked together.
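An offline context renders its graph as fast as possible into an AudioBuffer instead of to hardware. A sketch rendering one second of a tone:

```typescript
// Sketch: render 1 s of audio offline into an AudioBuffer.
const offline = new OfflineAudioContext(2, 44100, 44100); // channels, length, sampleRate
const osc = offline.createOscillator();
osc.connect(offline.destination);
osc.start(0);

const rendered: AudioBuffer = await offline.startRendering();
// `rendered` can now be played via an AudioBufferSourceNode or inspected.
```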
The OscillatorNode interface represents a periodic waveform, such as a sine wave.
The PannerNode interface defines an audio-processing object that represents the location, direction, and behavior of an audio source signal in a simulated physical space.
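A sketch positioning a source in 3D space relative to the context's AudioListener; the coordinate values are arbitrary:

```typescript
// Sketch: place a source to the listener's right using a PannerNode.
const ctx = new AudioContext();
const panner = ctx.createPanner();
panner.panningModel = "HRTF";  // perceptually richer than "equalpower"
panner.positionX.value = 2;    // to the right of the listener
panner.positionZ.value = -1;   // slightly in front

const osc = ctx.createOscillator();
osc.connect(panner).connect(ctx.destination);
osc.start();
```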
The PeriodicWave interface defines a periodic waveform that can be used to shape the output of an OscillatorNode.
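A sketch tying the two interfaces above together: a custom PeriodicWave built from Fourier coefficients shapes an OscillatorNode's output (the coefficients are arbitrary):

```typescript
// Sketch: a custom waveform, fundamental plus a quieter 2nd harmonic.
const ctx = new AudioContext();
const real = new Float32Array([0, 0, 0]);   // cosine terms (index 0 is DC)
const imag = new Float32Array([0, 1, 0.5]); // sine terms
const wave = ctx.createPeriodicWave(real, imag);

const osc = ctx.createOscillator();
osc.setPeriodicWave(wave);
osc.frequency.value = 220; // Hz
osc.connect(ctx.destination);
osc.start();
osc.stop(ctx.currentTime + 1); // scheduled stop, per AudioScheduledSourceNode
```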
The StereoPannerNode interface of the Web Audio API represents a simple stereo panner node that can be used to pan an audio stream left or right.
The WaveShaperNode interface represents a non-linear distorter.
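The distortion comes from a transfer curve mapping input samples to output samples. A sketch using a tanh soft-clip curve; the curve shape and oversample setting are illustrative:

```typescript
// Sketch: soft-clipping distortion with a WaveShaperNode.
const ctx = new AudioContext();
const shaper = ctx.createWaveShaper();

const curve = new Float32Array(1024);
for (let i = 0; i < curve.length; i++) {
  const x = (i / (curve.length - 1)) * 2 - 1; // map index to [-1, 1]
  curve[i] = Math.tanh(3 * x);                // gentle saturation
}
shaper.curve = curve;
shaper.oversample = "4x"; // reduces aliasing from the nonlinearity
shaper.connect(ctx.destination);
// ...a source would connect into `shaper`.
```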