WebAudio: live input

October 28th, 2012. Tagged: JavaScript, Music, WebAudio

Live input, aka getUserMedia: it exists in Chrome Canary for audio too. Great times to be a web developer, right?

Let's check it out.

Here's the demo, but first a prerequisite: go to chrome://flags, search for "Web Audio Input" and enable it. Restart Chrome Canary.

With a guitar

I wanted a slightly trickier setup: capturing guitar sound, not just voice through a microphone.

As always, getting the guitar sound into the computer was a bigger hurdle than anything on the JavaScript side.

I have a guitar amp that has a mini-USB out. This goes to the USB port of the computer. Wrestle, system settings, GarageBand to the rescue... eventually the computer makes sound.

Capturing

I was assuming the stream you get from getUserMedia could go directly into the src of an HTML <audio> element. No such luck. That works for video, but not yet for audio.

So... WebAudio API saves the day.

Setting up audio context (like in the previous post), shimming getUserMedia and setting up callbacks for it:

  // for logging
  function fire(e, data) {    
    log.innerHTML += "\n" + e + " " + (data || '');
  }
 
  // globals
  var audio_context;
  var volume;
 
  // one-off initialization
  (function init(g){
    try {
      audio_context = new (g.AudioContext || g.webkitAudioContext);
      fire('Audio context OK');
      // shim
      navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia;
      fire('navigator.getUserMedia ' + (navigator.getUserMedia ? 'OK' : 'fail'));
      // use
      navigator.getUserMedia(
        {audio:true},
        iCanHazUserMedia, 
        function(e){fire('No live audio input ' + e);}
      );
    } catch (e) {
      alert('No web audio support in this browser');
    }
  }(window));

When the user loads the page, here's what they see:

In my case I select the guitar amp and click the "Allow" button.

This little window informs me the page is using the audio input:

Playing back

Now that the user has allowed audio access, let's play back the audio we receive, but pass it through a volume control.

All this work happens in iCanHazUserMedia(), the success callback passed to getUserMedia.

  function iCanHazUserMedia(stream) {
    
    fire('I haz live stream');
    
    // wrap the live stream in a source node
    var input = audio_context.createMediaStreamSource(stream);
    // gain node to control the playback volume
    // (createGainNode() was later renamed createGain() in the spec)
    volume = audio_context.createGainNode();
    volume.gain.value = 0.8;
    // chain: input -> volume -> speakers
    input.connect(volume);
    volume.connect(audio_context.destination);
    
    fire('input connected to destination');
  }

What we have here (ignoring fire()):

  1. set up an input source node from the user's stream; this is the first node in the audio chain
  2. set up a volume (gain) node with an initial volume of 0.8 out of 1
  3. connect the input to the volume node, and the volume node to the output/speakers

And this is it!

Additionally, an input type=range min=0 max=1 step=0.1 slider can change the volume via volume.gain.value = value;
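A minimal sketch of that slider wiring (the slider id and the helper function are mine, not part of the demo; it assumes the global volume gain node set up above). The handler just copies the slider's value, which range inputs report as a string, into the gain:

```javascript
// returns an oninput handler bound to a given gain node
function makeVolumeHandler(gainNode) {
  return function (event) {
    // range inputs report their value as a string
    gainNode.gain.value = parseFloat(event.target.value);
  };
}

// in the page:
// <input type="range" min="0" max="1" step="0.1" value="0.8" id="vol">
// document.getElementById('vol').oninput = makeVolumeHandler(volume);
```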

Go play! Isn't it amazing that you can now grab the microphone or any other audio input and play around with it? All in JavaScript, all in the browser, without any plugins.

Moar!

This was a very basic exploratory/primer example. For more:

Meanwhile, hit me up on twitter @stoyanstefanov