
WebGL API: Animating textures in WebGL

In this demonstration, we build upon the previous example by replacing our static textures with the frames of a playing MP4 video file. This is actually pretty easy to do and fun to look at, so let's get started. Similar code will work with any other kind of data (such as a <canvas>) as the source for your textures, as sketched below.
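
For instance, a <canvas> can be uploaded with the same texImage2D() overload we'll use for the video below. Here is a minimal sketch of that idea; the sourceCanvas and texture names are illustrative and not part of this example's code:

// Sketch only: upload the contents of a 2D <canvas> to a WebGL texture.
// `gl` is a WebGL context and `texture` is assumed to have been created
// with gl.createTexture().
const sourceCanvas = document.createElement('canvas');
sourceCanvas.width = 256;
sourceCanvas.height = 256;

const ctx = sourceCanvas.getContext('2d');
ctx.fillStyle = 'blue';
ctx.fillRect(0, 0, sourceCanvas.width, sourceCanvas.height);

gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE,
              sourceCanvas);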

Getting access to the video

The first step is to create the <video> element that we'll use to retrieve the video frames:

// will set to true when video can be copied to texture
var copyVideo = false;

function setupVideo(url) {
  const video = document.createElement('video');

  var playing = false;
  var timeupdate = false;

  video.autoplay = true;
  video.muted = true;
  video.loop = true;

  // Waiting for these 2 events ensures
  // there is data in the video

  video.addEventListener('playing', function() {
     playing = true;
     checkReady();
  }, true);

  video.addEventListener('timeupdate', function() {
     timeupdate = true;
     checkReady();
  }, true);

  video.src = url;
  video.play();

  function checkReady() {
    if (playing && timeupdate) {
      copyVideo = true;
    }
  }

  return video;
}

First we create a video element. We set it to autoplay, mute the sound, and loop the video. We then set up two event listeners to make sure the video is playing and its time has been updated. We need both of these checks because it is an error to upload a video to WebGL that has no data available yet. Checking for both events guarantees there is data available and that it's safe to start uploading the video to a WebGL texture. In the code above, once we have received both events, we set the global variable copyVideo to true to signal that it's safe to start copying the video to a texture.

Finally, we set the src attribute to start loading the video and call play() to start playing it.
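
One small caveat worth noting: in modern browsers play() returns a Promise, which is rejected if playback can't start (for example, if autoplay is blocked). Since our video is muted this usually isn't a problem, but a defensive version of the call might look like the following sketch; the error handling shown is illustrative and not part of the original example:

  video.src = url;

  // play() returns a Promise in modern browsers; catch a rejection so a
  // blocked autoplay attempt is reported rather than failing silently.
  const playPromise = video.play();
  if (playPromise !== undefined) {
    playPromise.catch(function(error) {
      console.error('Video playback could not start:', error);
    });
  }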

Using the video frames as a texture

The next change is to initTexture(), which becomes much simpler, since it no longer needs to load an image file. Instead, all it does is create an empty texture object, put a single pixel in it, and set its filtering for later use:

function initTexture(gl) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);

  // Because video has to be downloaded over the internet,
  // it might take a moment until it's ready, so
  // put a single pixel in the texture so we can
  // use it immediately.
  const level = 0;
  const internalFormat = gl.RGBA;
  const width = 1;
  const height = 1;
  const border = 0;
  const srcFormat = gl.RGBA;
  const srcType = gl.UNSIGNED_BYTE;
  const pixel = new Uint8Array([0, 0, 255, 255]);  // opaque blue
  gl.texImage2D(gl.TEXTURE_2D, level, internalFormat,
                width, height, border, srcFormat, srcType,
                pixel);

  // Turn off mips and set wrapping to clamp to edge so it
  // will work regardless of the dimensions of the video.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

  return texture;
}

Here's what the updateTexture() function looks like; this is where the real work is done:

function updateTexture(gl, texture, video) {
  const level = 0;
  const internalFormat = gl.RGBA;
  const srcFormat = gl.RGBA;
  const srcType = gl.UNSIGNED_BYTE;
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, level, internalFormat,
                srcFormat, srcType, video);
}

You've seen this code before. It's nearly identical to the image onload function in the previous example, except when we call texImage2D(), instead of passing an Image object, we pass in the <video> element. WebGL knows how to pull the current frame out and use it as a texture.
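
As a possible optimization (not used in this tutorial), once a frame has been uploaded with texImage2D() and the texture therefore matches the video's dimensions, later frames could be copied with texSubImage2D(), which updates the existing storage instead of reallocating it each time. A sketch, assuming the texture has already been sized to match the video:

function updateTextureSub(gl, texture, video) {
  // Sketch only: assumes texImage2D() has already allocated this texture
  // at the video's width and height; texSubImage2D() then overwrites the
  // existing pixels with the current frame.
  const level = 0;
  const xoffset = 0;
  const yoffset = 0;
  const srcFormat = gl.RGBA;
  const srcType = gl.UNSIGNED_BYTE;
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texSubImage2D(gl.TEXTURE_2D, level, xoffset, yoffset,
                   srcFormat, srcType, video);
}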

If copyVideo is true, updateTexture() is called every frame, just before we call the drawScene() function:

  var then = 0;

  // Draw the scene repeatedly
  function render(now) {
    now *= 0.001;  // convert to seconds
    const deltaTime = now - then;
    then = now;

    if (copyVideo) {
      updateTexture(gl, texture, video);
    }

    drawScene(gl, programInfo, buffers, texture, deltaTime);

    requestAnimationFrame(render);
  }
  requestAnimationFrame(render);
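
For context, here is a rough sketch of how these pieces might be wired together in the example's main() function; the video file name is a placeholder, and the render loop is the one shown above:

  // Sketch only: wiring initTexture() and setupVideo() together inside
  // main(). 'video.mp4' is a placeholder URL.
  const texture = initTexture(gl);
  const video = setupVideo('video.mp4');

  // ...the render loop shown above then follows, closing over `video`
  // and `texture` and kicking itself off with requestAnimationFrame().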

That's all there is to it!


© 2005–2018 Mozilla Developer Network and individual contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/Tutorial/Animating_textures_in_WebGL