AnalyserNode

The AnalyserNode interface represents a node able to provide real-time frequency and time-domain analysis information. It is an AudioNode that passes the audio stream unchanged from the input to the output, but allows you to take the generated data, process it, and create audio visualizations.

An AnalyserNode has exactly one input and one output. The node works even if the output is not connected.

Without modifying the audio stream, the node lets you retrieve the frequency and time-domain data associated with it, using an FFT.
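For example, a sketch along the following lines (assuming an <audio> element with the ID "player" exists in the page; the element ID and variable names are illustrative, not part of the API) wires an analyser between a media element source and the speakers:

var audioCtx = new AudioContext();
var audioElement = document.getElementById("player"); // assumed <audio> element
var source = audioCtx.createMediaElementSource(audioElement);
var analyser = audioCtx.createAnalyser();

// The analyser passes the signal through unchanged while exposing analysis data.
source.connect(analyser);
analyser.connect(audioCtx.destination); // optional; the node also works unconnected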

Number of inputs 1
Number of outputs 1 (but may be left unconnected)
Channel count mode "explicit"
Channel count 1
Channel interpretation "speakers"

Inheritance

This interface inherits from the following parent interfaces: AudioNode, which itself inherits from EventTarget.

Constructor

AnalyserNode()
Creates a new instance of an AnalyserNode object.
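A minimal sketch of constructing the node directly, passing an existing AudioContext and an optional set of property values (the factory method audioCtx.createAnalyser() used in the examples below is an equivalent alternative); the option values shown are illustrative:

var audioCtx = new AudioContext();
var analyser = new AnalyserNode(audioCtx, {
  fftSize: 2048,               // illustrative value; must be a power of two
  smoothingTimeConstant: 0.8   // illustrative value between 0 and 1
});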

Properties

Inherits properties from its parent, AudioNode.

AnalyserNode.fftSize
Is an unsigned long value representing the size of the FFT (Fast Fourier Transform) to be used to determine the frequency domain.
AnalyserNode.frequencyBinCount Read only
Is an unsigned long value half that of the FFT size. This generally equates to the number of data values you will have to play with for the visualization.
AnalyserNode.minDecibels
Is a double value representing the minimum power value in the scaling range for the FFT analysis data, for conversion to unsigned byte values — basically, this specifies the minimum value for the range of results when using getByteFrequencyData().
AnalyserNode.maxDecibels
Is a double value representing the maximum power value in the scaling range for the FFT analysis data, for conversion to unsigned byte values — basically, this specifies the maximum value for the range of results when using getByteFrequencyData().
AnalyserNode.smoothingTimeConstant
Is a double value representing the averaging constant with the last analysis frame — basically, it makes the transition between values over time smoother.
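A minimal sketch of setting the properties listed above on a node created with the factory method; the specific values are illustrative, not recommendations:

var audioCtx = new AudioContext();
var analyser = audioCtx.createAnalyser();

analyser.fftSize = 2048;               // a power of two between 32 and 32768
analyser.minDecibels = -90;            // lower bound for byte-scaled frequency data
analyser.maxDecibels = -10;            // upper bound for byte-scaled frequency data
analyser.smoothingTimeConstant = 0.85; // 0 = no averaging, 1 = maximum smoothing

console.log(analyser.frequencyBinCount); // 1024, i.e. always fftSize / 2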

Methods

Inherits methods from its parent, AudioNode.

AnalyserNode.getFloatFrequencyData()
Copies the current frequency data into a Float32Array array passed into it.
AnalyserNode.getByteFrequencyData()
Copies the current frequency data into a Uint8Array (unsigned byte array) passed into it.
AnalyserNode.getFloatTimeDomainData()
Copies the current waveform, or time-domain, data into a Float32Array array passed into it.
AnalyserNode.getByteTimeDomainData()
Copies the current waveform, or time-domain, data into a Uint8Array (unsigned byte array) passed into it.
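A sketch of taking one snapshot of data with each of the four methods; note the array sizing conventions: frequency data fills frequencyBinCount elements, while time-domain data can fill up to fftSize elements:

var audioCtx = new AudioContext();
var analyser = audioCtx.createAnalyser();

var floatFreq = new Float32Array(analyser.frequencyBinCount);
var byteFreq  = new Uint8Array(analyser.frequencyBinCount);
var floatTime = new Float32Array(analyser.fftSize);
var byteTime  = new Uint8Array(analyser.fftSize);

analyser.getFloatFrequencyData(floatFreq);  // values in dB
analyser.getByteFrequencyData(byteFreq);    // values scaled to 0-255
analyser.getFloatTimeDomainData(floatTime); // samples in the range -1 to 1
analyser.getByteTimeDomainData(byteTime);   // samples scaled to 0-255 (128 is silence)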

Examples

Note: See the guide Visualizations with Web Audio API for more information on creating audio visualizations.

Basic usage

The following example shows basic usage of an AudioContext to create an AnalyserNode, then uses requestAnimationFrame and <canvas> to repeatedly collect time-domain data and draw an "oscilloscope style" output of the current audio input. For more complete applied examples/information, check out our Voice-change-O-matic demo (see app.js lines 128–205 for relevant code).

var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var analyser = audioCtx.createAnalyser();

// ...

analyser.fftSize = 2048;
var bufferLength = analyser.frequencyBinCount; // half the FFT size
var dataArray = new Uint8Array(bufferLength);  // reused for each frame of data
analyser.getByteTimeDomainData(dataArray);

// Get a canvas defined with ID "oscilloscope"
var canvas = document.getElementById("oscilloscope");
var canvasCtx = canvas.getContext("2d");

// draw an oscilloscope of the current audio source

var drawVisual; // keep the request ID so the animation could be cancelled later

function draw() {

  drawVisual = requestAnimationFrame(draw);

  analyser.getByteTimeDomainData(dataArray);

  canvasCtx.fillStyle = 'rgb(200, 200, 200)';
  canvasCtx.fillRect(0, 0, canvas.width, canvas.height);

  canvasCtx.lineWidth = 2;
  canvasCtx.strokeStyle = 'rgb(0, 0, 0)';

  canvasCtx.beginPath();

  var sliceWidth = canvas.width * 1.0 / bufferLength;
  var x = 0;

  for (var i = 0; i < bufferLength; i++) {

    var v = dataArray[i] / 128.0;
    var y = v * canvas.height / 2;

    if (i === 0) {
      canvasCtx.moveTo(x, y);
    } else {
      canvasCtx.lineTo(x, y);
    }

    x += sliceWidth;
  }

  canvasCtx.lineTo(canvas.width, canvas.height / 2);
  canvasCtx.stroke();
}

draw();
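The same analyser can also drive a frequency visualization. The sketch below reads frequency data with getByteFrequencyData() instead and draws a simple bar graph; the canvas ID "frequency-graph" is an assumption made for illustration, not part of the example above:

var freqCanvas = document.getElementById("frequency-graph"); // assumed second canvas
var freqCtx = freqCanvas.getContext("2d");
var freqData = new Uint8Array(analyser.frequencyBinCount);

function drawBars() {
  requestAnimationFrame(drawBars);

  analyser.getByteFrequencyData(freqData);

  freqCtx.fillStyle = 'rgb(200, 200, 200)';
  freqCtx.fillRect(0, 0, freqCanvas.width, freqCanvas.height);

  var barWidth = freqCanvas.width / freqData.length;

  for (var i = 0; i < freqData.length; i++) {
    // Scale each byte value (0-255) to the height of the canvas
    var barHeight = (freqData[i] / 255) * freqCanvas.height;
    freqCtx.fillStyle = 'rgb(50, 50, ' + (freqData[i] + 100) + ')';
    freqCtx.fillRect(i * barWidth, freqCanvas.height - barHeight, barWidth, barHeight);
  }
}

drawBars();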

Specifications

Browser compatibility

Feature Chrome Edge Firefox Internet Explorer Opera Safari
Basic support 14 Yes 25 No 15 6
AnalyserNode() constructor 55 ? 53 No 42 ?
fftSize 14 Yes 25 No 15 6
frequencyBinCount 14 Yes 25 No 15 6
minDecibels 14 Yes 25 No 15 6
maxDecibels 14 Yes 25 No 15 6
smoothingTimeConstant 14 Yes 25 No 15 6
getFloatFrequencyData 14 Yes 25 No 15 6
getByteFrequencyData 14 Yes 25 No 15 6
getFloatTimeDomainData 14 Yes 25 No 15 6
getByteTimeDomainData 14 Yes 25 No 15 6
Feature Android webview Chrome for Android Edge mobile Firefox for Android IE mobile Opera Android iOS Safari
Basic support Yes 14 Yes 26 No 15 ?
AnalyserNode() constructor 55 55 ? 53 No 42 ?
fftSize Yes 14 Yes 26 No 15 ?
frequencyBinCount Yes 14 Yes 26 No 15 ?
minDecibels Yes 14 Yes 26 No 15 ?
maxDecibels Yes 14 Yes 26 No 15 ?
smoothingTimeConstant Yes 14 Yes 26 No 15 ?
getFloatFrequencyData Yes 14 Yes 26 No 15 ?
getByteFrequencyData Yes 14 Yes 26 No 15 ?
getFloatTimeDomainData Yes 14 Yes 26 No 15 ?
getByteTimeDomainData Yes 14 Yes 26 No 15 ?

See also

Visualizations with Web Audio API

© 2005–2018 Mozilla Developer Network and individual contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode