art with code

2014-02-17

Saving out video frames from a WebGL app

Recently, I wanted to create a small video clip of a WebGL demo of mine. The route I ended up going down was to send the frames by XHR to a small web server that writes the frames to disk. Here's a quick article detailing how you can do the same.


Here's the video I made.

Set up your web server

I used plain Node.js for my server. It's got CORS headers set, so you can send requests to it from any port or domain you want to use. Heck, you could even do distributed rendering with all the renderers sending finished frames to the server.

Here's the code for the server. It reads POST requests into a buffer and writes the buffer to a file. Simple stuff.

// Adapted from the WikiBooks OpenGL Programming Video Capture article.
var port = 3999;
var http = require('http');
var fs = require('fs');
http.createServer(function (req, res) {
    res.writeHead(200, {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Headers': 'Content-Type, X-Requested-With'
    });
    if (req.method === 'OPTIONS') {
        // Handle OPTIONS requests to work with jQuery and other libs that cause preflighted CORS requests.
        res.end();
        return;
    }
    var idx = req.url.split('/').pop();
    var filename = ("0000" + idx).slice(-5)+".png";
    var img = new Buffer('');
    req.on('data', function(chunk) {
        img = Buffer.concat([img, chunk]);
    });
    req.on('end', function() {
        fs.writeFileSync(filename, img);
        console.log('Wrote ' + filename);
        res.end();
    });
}).listen(port, '127.0.0.1');
console.log('Server running at http://127.0.0.1:'+port+'/');

To run the server, save it to server.js and run it with node server.js.
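If you want a quick sanity check that the server works before wiring up WebGL, you can post any PNG to it from a second Node script. This isn't part of the original setup, and test.png below is just a stand-in for whatever image you have lying around:

// check.js - posts a local PNG to the capture server as frame 0.
// Assumes server.js above is running and that test.png exists in this directory.
var http = require('http');
var fs = require('fs');
var data = fs.readFileSync('test.png');
var req = http.request({
    host: '127.0.0.1',
    port: 3999,
    path: '/0',
    method: 'POST'
}, function(res) {
    console.log('Server responded with status ' + res.statusCode);
});
req.end(data);

If everything is in order, the server prints "Wrote 00000.png" and you'll find the file next to server.js.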

Send frames to the server

There are a couple of things you need to consider on the WebGL side. First, use a fixed resolution canvas, instead of one that resizes with the browser window. Second, make your timing values fixed as well, instead of using wall clock time. To do this, decide the frame rate for the video (probably 30 FPS) and the duration of the video in seconds. Now you know how much to advance the time with every frame (1/FPS) and how many frames to render (FPS*duration). Third, turn your frame loop into a render & send loop.
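On the first point: if your canvas currently tracks the window size, just set its dimensions once before the capture loop. Here's a minimal sketch, assuming the canvas element is called glc (as in the loop below) and picking 1024x768 as an arbitrary resolution:

// Lock the drawing buffer to a fixed size so every captured frame has the same dimensions.
glc.width = 1024;
glc.height = 768;
gl.viewport(0, 0, glc.width, glc.height);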

To send frames to the server, read PNG images from the WebGL canvas using toDataURL and send them to the server using XMLHttpRequest. To get the upload to work, you need to convert the data URLs to binary Blobs and send the Blobs instead. It's pretty simple but can cost you an hour of banging your head against the wall (as experience attests). Worry not, I've got it all done in the snippet below, ready to use.

Here's the core of my render & send loop:

var fps = 30; // Frames per second.
var duration = 1; // Video duration in seconds.

// Set the size of your canvas to a fixed value so that all the frames are the same size.
// resize(1024, 768);

var t = 0; // Time in ms.

for (var captureFrame = 0; captureFrame < fps*duration; captureFrame++) {
    // Advance time.
    t += 1000 / fps;

    // Set up your WebGL frame and draw it.
    uniform1f(gl, program, 'time', t/1000);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
    gl.drawArrays(gl.TRIANGLES, 0, 6);

    // Send a synchronous request to the server (sync to make this simpler).
    var r = new XMLHttpRequest();
    r.open('POST', 'http://localhost:3999/' + captureFrame, false);
    var blob = dataURItoBlob(glc.toDataURL());
    r.send(blob);
}

// Utility function to convert dataURIs to Blobs.
// Thanks go to SO: http://stackoverflow.com/a/15754051
function dataURItoBlob(dataURI) {
    var mimetype = dataURI.split(',')[0].split(':')[1].split(';')[0];
    var byteString = atob(dataURI.split(',')[1]);
    var u8a = new Uint8Array(byteString.length);
    for (var i = 0; i < byteString.length; i++) {
        u8a[i] = byteString.charCodeAt(i);
    }
    return new Blob([u8a.buffer], { type: mimetype });
}
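A note on the snippet: uniform1f is a small helper, not a WebGL built-in (gl.uniform1f itself takes a uniform location rather than a name), and glc is the WebGL canvas element, so glc.toDataURL() grabs the rendered frame as a PNG data URI. If you don't have an equivalent helper in your own code, a minimal sketch looks something like this:

// Minimal stand-in for the uniform1f helper used above: looks up the uniform
// by name on the given program and sets it to a float value.
function uniform1f(gl, program, name, value) {
    var location = gl.getUniformLocation(program, name);
    gl.uniform1f(location, value);
}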

Conclusion

And there we go! All is dandy and rendering frames out to a server is a cinch. Now you're ready to start producing your very own WebGL-powered videos! To turn the frame sequence into a video file, either use it as an image sequence source in Adobe Media Encoder or use a command-line tool like ffmpeg or avconv: avconv -r 30 -i %05d.png -y output.webm. The %05d pattern matches the five-digit, zero-padded filenames the server writes out.

To sum up, you now have a simple solution for capturing video from a WebGL app: the browser renders out frames and sends them to a server that writes them to disk, and a media encoder turns the frame sequence into a video file. Easy as pie! Thanks for reading, hope you have a great time making your very own WebGL videos!

Addendum

After posting this on Twitter, I got a bunch of links to other great libs / approaches to do WebGL / Canvas frame capture. Here's a quick recap of them: RenderFlies by @BlurSpline - it's a similar approach to the one above but also calls ffmpeg from inside the server to do video encoding. Then there's this snippet by Steven Wittens - instead of requiring a server for writing out the frames, it uses the FileSystem API and writes them to disk straight from the browser.

And finally, CCapture.js by Jaume Sánchez Elias. CCapture.js hooks up to requestAnimationFrame, Date.now and setTimeout to make fixed timing signals for a WebGL / Canvas animation then captures the frames from the canvas and gives you a nice WebM video file when you're done. And it all happens in the browser, which is awesome! No need to fiddle around with a server.

Thanks for the links and keep 'em coming!
