I want to develop a web app for mobile phones that records audio from the microphone and plays music at the same time.
With getUserMedia() I get the microphone stream and create a MediaStreamSource from it in my AudioContext. At the same time I create a BufferSource which plays music. In desktop Chrome this setup works. But when I open the same web app in Chrome on my Nexus 5 and allow it to use the microphone, the music is muted.
Success Callback for getUserMedia:
function gotStream(stream) {
    // Feed the microphone stream into the audio graph and meter its volume
    mediaStreamSource = audioContext.createMediaStreamSource(stream);
    meter = createAudioMeter(audioContext);
    mediaStreamSource.connect(meter);
    // Element used to display the meter readings
    info = document.getElementById('info');
    outputData();
}
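One thing worth trying (a sketch, not a confirmed fix): on some Android Chrome builds, opening the microphone with the default constraints routes audio through the voice-communication path, which can duck or mute other playback. Requesting the stream with the built-in audio processing disabled may avoid that path. The function name below is my own; the constraint names are the standard MediaTrackConstraints properties.

```javascript
// Sketch: request the microphone with audio processing disabled.
// Whether this avoids the ducking on a given device is an assumption
// that needs testing on the actual hardware.
function getRawMicStream() {
    return navigator.mediaDevices.getUserMedia({
        audio: {
            echoCancellation: false,
            noiseSuppression: false,
            autoGainControl: false
        }
    });
}
```

You could then call `getRawMicStream().then(gotStream)` instead of the plain `getUserMedia({ audio: true }, gotStream, errorCallback)` form.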
Play Music Function:
function playSound(buffer) {
    // Play a decoded AudioBuffer through a gain node to the speakers
    source = audioContext.createBufferSource();
    source.buffer = buffer;
    gainNode = audioContext.createGain();
    source.connect(gainNode);
    gainNode.connect(audioContext.destination);
    source.start(0);
}
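For completeness, this is how I obtain the buffer that playSound() receives (a minimal sketch; the URL is a placeholder and the function name is my own):

```javascript
// Sketch: fetch a music file, decode it, and hand the resulting
// AudioBuffer to playSound(). decodeAudioData returns a promise
// in current browsers.
function loadAndPlay(audioContext, url, playSound) {
    return fetch(url)
        .then(function (response) { return response.arrayBuffer(); })
        .then(function (data) { return audioContext.decodeAudioData(data); })
        .then(playSound);
}
```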
Is this the expected behaviour on mobile, or am I doing something wrong?