Voximplant Blog

Web SDK: Microphone Activity Visualization

Mic Activity

If you are embedding VoIP functionality into your web application, it's a good idea to give end users some feedback from the system about their audio settings, especially their microphone / audio recording device. Combining the power of WebRTC, HTML5 and the Web Audio API, we can visualize the audio stream coming from the microphone after the user has allowed access to the device. The latest version of the Web SDK (3.6.294) gives developers access to the MediaStream object: it is available as the stream property of the MicAccessResult event, or as the argument passed to the successCallback of the following functions: attachRecordingDevice, useAudioSource, useVideoSource, setVideoSettings.
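As a quick sketch of the two ways to obtain the stream described above (handleMicAccessResult, switchMicrophone and visualize are placeholder names for illustration, not SDK APIs):

```javascript
// Placeholder for whatever rendering code you attach to the stream
function visualize(stream) {
  console.log("got MediaStream:", stream);
}

// Way 1: the stream property of the MicAccessResult event
function handleMicAccessResult(e) {
  if (e.result) visualize(e.stream);
}

// Way 2: the success callback of useAudioSource (the same pattern applies
// to attachRecordingDevice, useVideoSource and setVideoSettings)
function switchMicrophone(vox, deviceId) {
  vox.useAudioSource(deviceId, function(stream) {
    visualize(stream);
  });
}
```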

In our example we will use a canvas element to render the microphone activity bar:

<canvas id="mic_activity" width="30" height="150" style="border: solid 1px #ccc;"></canvas>

and the following JavaScript code:

var vox = VoxImplant.getInstance(),
    javascriptNode;

vox.addEventListener(VoxImplant.Events.SDKReady, function(e) { vox.connect(); });
vox.addEventListener(VoxImplant.Events.MicAccessResult, onMicAccessResult);

vox.init({
  micRequired: true
});

function onMicAccessResult(e) {
  if (e.result) createMicActivityIndicator(e.stream);
}

function createMicActivityIndicator(stream) {

  // Older WebKit browsers expose the prefixed constructor
  var AudioContext = window.AudioContext || window.webkitAudioContext;

  var audioContext = new AudioContext(),
      analyser = audioContext.createAnalyser(),
      microphone = audioContext.createMediaStreamSource(stream);

  // 2048-sample buffer, 1 input channel, 1 output channel
  javascriptNode = audioContext.createScriptProcessor(2048, 1, 1);

  analyser.smoothingTimeConstant = 0.3;
  analyser.fftSize = 1024;

  microphone.connect(analyser);
  analyser.connect(javascriptNode);
  javascriptNode.connect(audioContext.destination);

  var ctx = document.getElementById("mic_activity").getContext("2d");

  javascriptNode.onaudioprocess = function() {
    // Average the frequency bins (each 0-255) into a single level value
    var array = new Uint8Array(analyser.frequencyBinCount);
    analyser.getByteFrequencyData(array);
    var values = 0,
        length = array.length;

    for (var i = 0; i < length; i++) values += array[i];

    var average = values / length;

    // Redraw the bar: red at the top, yellow in the middle, green at the bottom
    ctx.clearRect(0, 0, 30, 150);
    var grad = ctx.createLinearGradient(1, 1, 28, 148);
    grad.addColorStop(0, "#FF0000");
    grad.addColorStop(0.5, "yellow");
    grad.addColorStop(1, "#00FF00");
    ctx.fillStyle = grad;
    // The height of the filled rectangle is the average level itself
    ctx.fillRect(1, 148 - average, 28, average);
  };
}
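The level calculation inside onaudioprocess boils down to a mean of the byte frequency bins. Factored out into a standalone helper (a sketch for clarity, not part of the SDK), it looks like this:

```javascript
// Mean of the byte frequency bins returned by getByteFrequencyData.
// Each bin is 0-255, so the result is also in the 0-255 range.
function averageLevel(bins) {
  var sum = 0;
  for (var i = 0; i < bins.length; i++) sum += bins[i];
  return bins.length ? sum / bins.length : 0;
}
```

In the listing above, the result of this computation is used directly as the bar height in pixels.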

That's it! After the user allows access to the microphone, the MediaStream is passed to the createMicActivityIndicator function, which uses the Web Audio API and Canvas for visualization. You can grab the full code example at Gist.
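One subtlety worth noting: getByteFrequencyData returns values in the 0–255 range, while the drawable area of our 150 px canvas is only about 147 px tall, so a very loud signal can push the bar past the top. A hypothetical barRect helper (not in the original code) that clamps the level before drawing could look like this:

```javascript
// Map an average level (0-255) to the bar rectangle inside the canvas,
// clamping so loud input never overflows the drawable area.
// innerHeight is the height of the drawable area in pixels (e.g. 147).
function barRect(average, innerHeight) {
  var h = Math.min(average, innerHeight); // clamp to the drawable area
  return { x: 1, y: 1 + innerHeight - h, w: 28, h: h };
}
```

The returned rectangle would then be passed to ctx.fillRect in place of the hard-coded geometry.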
