AudioContext

The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode. An audio context controls both the creation of the nodes it contains and the execution of the audio processing or decoding. You need to create an AudioContext before you do anything else, as everything happens inside a context.

Constructor

AudioContext()
Creates and returns a new AudioContext object.
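
The constructor also accepts an optional options dictionary with latencyHint and sampleRate members. As a sketch, contextOptions below is a hypothetical helper (not part of the API) that bundles those options:

```javascript
// Hypothetical helper building an AudioContextOptions dictionary.
// "interactive" favors low latency; "playback" favors power savings.
function contextOptions(interactive) {
  return {
    latencyHint: interactive ? "interactive" : "playback",
    sampleRate: 44100 // force a rate instead of the device default
  };
}

// In a browser: var audioCtx = new AudioContext(contextOptions(true));
```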

Properties

Also inherits properties from its parent interface, BaseAudioContext.

AudioContext.baseLatency Read only
Returns the number of seconds of processing latency incurred by the AudioContext passing the audio from the AudioDestinationNode to the audio subsystem.
AudioContext.outputLatency Read only
Returns an estimate of the output latency of the current audio context, in seconds.
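
Together, the two figures estimate how long a rendered sample takes to reach the audio hardware. A minimal sketch (totalLatency is a hypothetical helper; outputLatency is not implemented everywhere, so it is treated as 0 when absent):

```javascript
// Hypothetical helper: total estimated latency in seconds.
// baseLatency covers the processing graph; outputLatency covers
// the trip from the AudioDestinationNode to the hardware.
function totalLatency(ctx) {
  return (ctx.baseLatency || 0) + (ctx.outputLatency || 0);
}

// In a browser: totalLatency(audioCtx);
```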

Methods

Also inherits methods from its parent interface, BaseAudioContext.

AudioContext.close()
Closes the audio context, releasing any system audio resources that it uses.
AudioContext.createMediaElementSource()
Creates a MediaElementAudioSourceNode associated with an HTMLMediaElement. This can be used to play and manipulate audio from <video> or <audio> elements.
AudioContext.createMediaStreamSource()
Creates a MediaStreamAudioSourceNode associated with a MediaStream representing an audio stream which may come from the local computer microphone or other sources.
AudioContext.createMediaStreamDestination()
Creates a MediaStreamAudioDestinationNode associated with a MediaStream representing an audio stream which may be stored in a local file or sent to another computer.
AudioContext.createMediaStreamTrackSource()
Creates a MediaStreamTrackAudioSourceNode associated with a MediaStream representing a media stream track.
AudioContext.getOutputTimestamp()
Returns a new AudioTimestamp object containing two correlated timestamps: the context's audio stream position and the performance (system clock) time at which that position was reached.
AudioContext.resume()
Resumes the progression of time in an audio context that has previously been suspended.
AudioContext.suspend()
Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
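
suspend(), resume(), and close() all return promises that resolve once the state change has taken effect, and the context's current state is exposed through the inherited state property ("running", "suspended", or "closed"). A sketch of a play/pause toggle built on top of them (togglePlayback is a hypothetical helper):

```javascript
// Hypothetical toggle built on suspend()/resume(); both return
// promises that resolve once the state change takes effect.
function togglePlayback(ctx) {
  if (ctx.state === "running") {
    return ctx.suspend();
  }
  if (ctx.state === "suspended") {
    return ctx.resume();
  }
  // A "closed" context cannot be restarted.
  return Promise.reject(new Error("context is closed"));
}

// In a browser, e.g. from a button's click handler:
// togglePlayback(audioCtx);
```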

Examples

Basic audio context declaration:

var audioCtx = new AudioContext();

Cross-browser variant, for older browsers that only implement the webkit-prefixed constructor:

var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

var oscillatorNode = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();
var finish = audioCtx.destination;
// etc.
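
Continuing the snippet above, the nodes can be wired into a source-to-gain-to-destination chain and played. playTone below is a hypothetical helper, and the frequency and gain values are illustrative:

```javascript
// Hypothetical helper: builds an oscillator -> gain -> destination
// chain on the given context and plays a tone for `seconds`.
function playTone(ctx, frequency, seconds) {
  var osc = ctx.createOscillator();
  var gain = ctx.createGain();
  osc.frequency.value = frequency; // e.g. 440 for A4
  gain.gain.value = 0.5;           // half volume
  osc.connect(gain);
  gain.connect(ctx.destination);   // route to the speakers
  osc.start();
  osc.stop(ctx.currentTime + seconds);
  return osc;
}

// In a browser: playTone(audioCtx, 440, 1);
```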

Specifications

Browser compatibility

Feature                        Chrome             Edge  Firefox  Internet Explorer  Opera              Safari
Basic support                  35 (14–57 webkit)  Yes   25       No                 22 (15–44 webkit)  6 webkit
AudioContext() constructor     55                 Yes   25       No                 42                 Yes webkit
baseLatency                    60                 ?     No       No                 47                 No
outputLatency                  Yes                ?     No       No                 Yes                No
close                          43                 ?     40       No                 Yes                ?
createMediaElementSource       14                 Yes   25       No                 15                 6
createMediaStreamSource        14                 Yes   25       No                 15                 6
createMediaStreamDestination   14                 Yes   25       No                 15                 6
createMediaStreamTrackSource   ?                  ?     No       No                 ?                  No
getOutputTimestamp             57                 ?     No       No                 44                 No
suspend                        43                 ?     40       No                 Yes                ?
Feature                        Android webview  Chrome for Android  Edge mobile  Firefox for Android  IE mobile  Opera Android      iOS Safari
Basic support                  Yes              35 (14–57 webkit)   Yes          26                   No         22 (15–44 webkit)  ?
AudioContext() constructor     55               55                  ?            25                   No         42                 ?
baseLatency                    60               60                  ?            No                   No         47                 No
outputLatency                  Yes              Yes                 ?            No                   No         Yes                ?
close                          43               43                  ?            40                   No         Yes                ?
createMediaElementSource       Yes              14                  Yes          26                   No         15                 ?
createMediaStreamSource        Yes              14                  Yes          26                   No         15                 ?
createMediaStreamDestination   Yes              14                  Yes          26                   No         15                 ?
createMediaStreamTrackSource   ?                ?                   ?            No                   No         ?                  No
getOutputTimestamp             57               57                  ?            No                   No         44                 No
suspend                        43               43                  ?            40                   No         Yes                ?

See also

© 2005–2018 Mozilla Developer Network and individual contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/AudioContext