
Web SDK: Microphone Activity Visualization

If you are embedding VoIP functionality into your web application, it's a good idea to give the end user some feedback about their audio settings, especially the microphone / audio recording device. By combining WebRTC, HTML5, and the Web Audio API, we can visualize the audio stream coming from the microphone after the user has allowed access to the device. Since version 3.6.294 of the Web SDK, developers can access the MediaStream object: it is available as the stream property of the MicAccessResult event.
In our example we will use a Canvas element to render a microphone activity bar:

 

<canvas id="mic_activity" style="border: solid 1px #ccc;" width="30" height="150"></canvas>

and the following JavaScript code:

const vox = VoxImplant.getInstance();
let javascriptNode;
// Subscribe to the microphone access result before connecting to the cloud
vox.addEventListener(VoxImplant.Events.MicAccessResult, onMicAccessResult);

vox.init({micRequired: true})
  .then(()=>{
    return vox.connect();
  })
  .catch(()=>{
    console.log('Can\'t connect to the Voximplant cloud.')
  });

function onMicAccessResult(e) {
  if (e.result) createMicActivityIndicator(e.stream);
}

function createMicActivityIndicator(stream) {
  // Route the microphone stream through an analyser node so we can read its level
  const audioContext = new AudioContext();
  const analyser = audioContext.createAnalyser();
  const microphone = audioContext.createMediaStreamSource(stream);
  javascriptNode = audioContext.createScriptProcessor(2048, 1, 1);
  analyser.smoothingTimeConstant = 0.3;
  analyser.fftSize = 1024;
  microphone.connect(analyser);
  analyser.connect(javascriptNode);
  javascriptNode.connect(audioContext.destination);
  const ctx = document.getElementById('mic_activity').getContext('2d');
  javascriptNode.onaudioprocess = function() {
    // Average the frequency bins to get the current input level
    const array = new Uint8Array(analyser.frequencyBinCount);
    analyser.getByteFrequencyData(array);
    let values = 0;
    const length = array.length;
    for (let i = 0; i < length; i++) {
      values += array[i];
    }
    const average = values / length;
    // Redraw the bar; the gradient fill grows from the bottom as the level rises
    ctx.clearRect(0, 0, 30, 150);
    const grad = ctx.createLinearGradient(1, 1, 28, 148);
    grad.addColorStop(0, '#FF0000');
    grad.addColorStop(0.5, 'yellow');
    grad.addColorStop(1, '#00FF00');
    ctx.fillStyle = grad;
    ctx.fillRect(1, 148 - average, 28, average);
  }
}

That's it! After the user allows access to the microphone, the MediaStream is passed to the createMicActivityIndicator function, which uses the Web Audio API and Canvas for visualization. You can grab the full code example from the Gist.
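Note that ScriptProcessorNode is deprecated in current browsers. As an alternative sketch (not part of the original example), you can poll the AnalyserNode from requestAnimationFrame instead; the createMicActivityIndicatorRAF name below is just illustrative, and the code assumes the same canvas markup and the same stream obtained from the MicAccessResult event:

// Alternative sketch: poll the AnalyserNode from requestAnimationFrame
// instead of using the deprecated ScriptProcessorNode.
function createMicActivityIndicatorRAF(stream) {
  const audioContext = new AudioContext();
  const analyser = audioContext.createAnalyser();
  analyser.smoothingTimeConstant = 0.3;
  analyser.fftSize = 1024;
  audioContext.createMediaStreamSource(stream).connect(analyser);

  const ctx = document.getElementById('mic_activity').getContext('2d');
  const data = new Uint8Array(analyser.frequencyBinCount);

  (function draw() {
    // Same averaging and drawing logic as above, only the scheduling differs
    analyser.getByteFrequencyData(data);
    const average = data.reduce((sum, v) => sum + v, 0) / data.length;
    ctx.clearRect(0, 0, 30, 150);
    const grad = ctx.createLinearGradient(1, 1, 28, 148);
    grad.addColorStop(0, '#FF0000');
    grad.addColorStop(0.5, 'yellow');
    grad.addColorStop(1, '#00FF00');
    ctx.fillStyle = grad;
    ctx.fillRect(1, 148 - average, 28, average);
    requestAnimationFrame(draw);
  })();
}

You can call it from onMicAccessResult the same way as createMicActivityIndicator; only the way the redraw is scheduled changes.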

Tags: web sdk