I'm working on a browser app where the user can record their voice. Using the MediaRecorder API I can get the recording and POST it to my server. The problem comes in when the user pauses and then restarts the recording. When that happens, the first time the recording is fetched from the server it plays fine, but on replay only the last segment is played. If I reload the web page and try again, once again I get the whole recording the first time, and only the last segment on subsequent plays.
It uses the Opus codec, so I tried playing it in VLC. There I get only the first segment, never any of the subsequent ones. Finally, I tried converting to MP3, and the MP3 has the whole recording! So the whole thing is being saved, but somehow the segments seem to be stored separately in the file and mess up replay. In each case only one segment of the blob is playing. I have to say I'm at something of a loss as to how to attack this. The time shown by the player is the length of the first segment, whether it plays the first segment, the second, or the whole thing. Any thoughts?
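For context, the upload side is nothing special; it is roughly like this (the endpoint name is just a placeholder, and data() is the method in the example below that returns the concatenated blob as base64):

// Rough sketch of the upload; '/api/recordings' is only a placeholder.
async function upload(recording) {
  const payload = await recording.data();   // { codec, data } with data as base64
  await fetch('/api/recordings', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload)
  });
}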
Edited to provide a working example.
How to test: put this code somewhere it can be served and open it (I use a Chromium-based browser). Click Start to begin recording, then Pause, then Start again to continue recording, then Pause again to stop. Then click setup to load the recording and listen to it. The first time I listen I get the whole recording, though the playback timer only shows the length of the first segment. Subsequent playbacks play only the last segment. Pressing the setup button again makes it play the whole recording, but again only the first time.
<!doctype html>
<html>
<head>
  <script>
    var audio = null;

    function init() {
      audio = new Recording();
      audio.prepareAudio();
    }
    class Recording {
      recordButton;
      pauseButton;
      recorder;
      mimeType;
      audioChunks;

      constructor() {
        this.mimeType = this.recorder = null;
        this.recordButton = this.pauseButton = null;
        this.audioChunks = [];
      }

      // Acquires the microphone once and returns start/pause controls.
      getRecorder() {
        return new Promise(async resolve => {
          const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
          const mediaRecorder = new MediaRecorder(stream);
          var _this = this;
          // Each recorded segment arrives here and is collected for later concatenation.
          mediaRecorder.addEventListener('dataavailable', event => {
            _this.audioChunks.push(event.data);
          });
          const start = () => {
            mediaRecorder.start();
          };
          const pause = () =>
            new Promise(resolve => {
              // 'once' keeps a fresh listener from piling up on every pause.
              mediaRecorder.addEventListener('stop', () => {
                resolve(_this.audioChunks);
              }, { once: true });
              mediaRecorder.stop();
              _this.mimeType = mediaRecorder.mimeType;
            });
          resolve({ start, pause });
        });
      }

      // e.g. 'audio/webm;codecs=opus' -> 'opus'
      get codec() {
        if (!this.mimeType) return null;
        let split = this.mimeType.split('=');
        return (split.length > 1) ? split[1] : split[0];
      }
      prepareAudio() {
        this.recordButton = document.querySelector('#record');
        this.pauseButton = document.querySelector('#pause');
        var _this = this;
        this.recordButton.addEventListener('click', async () => {
          _this.recordButton.setAttribute('disabled', true);
          _this.pauseButton.removeAttribute('disabled');
          if (!_this.recorder) {
            _this.recorder = await this.getRecorder();
          }
          _this.recorder.start();
        });
        this.pauseButton.addEventListener('click', async () => {
          _this.recordButton.removeAttribute('disabled');
          _this.pauseButton.setAttribute('disabled', true);
          await _this.recorder.pause();
        });
      }

      // Returns the recording as base64 for POSTing to the server.
      data() {
        return new Promise((resolve, reject) => {
          const reader = new FileReader();
          let codec = this.audioChunks[0].type;
          let audioBlob = new Blob(this.audioChunks, { type: this.codec || codec });
          reader.readAsDataURL(audioBlob);
          reader.onload = () => resolve({ codec: codec, data: reader.result.split(',')[1] });
          reader.onerror = () => reject(reader.error);
        });
      }

      // Concatenates the collected segments into one Blob and points the player at it.
      blobUrl() {
        let codec = this.audioChunks[0].type;
        let audioBlob = new Blob(this.audioChunks, { type: this.codec || codec });
        let blobUrl = URL.createObjectURL(audioBlob);
        let player = document.getElementById('play-blob');
        player.src = blobUrl;
        player.disabled = false;
        return blobUrl;
      }
    }
  </script>
</head>
<body onload="init()">
  <div>
    <button id="record" class="button fill">Start</button>
    <br />
    <button id="pause" class="button fill">Pause</button>
    <br />
    <button class="button fill" onclick="audio.blobUrl()">setup</button>
  </div>
  <audio id="play-blob" controls></audio>
</body>
</html>
This isn't a complete answer, but I'm understanding more of what is happening. The audio player, at least in the up-to-date versions of Chrome and Firefox I am using, does not seem to handle this kind of streamed audio properly. When the source is loaded it does not know the length (of course). When the blob is built from multiple segments (new Blob([segment1, segment2, ...])), the first time the duration is reported as Infinity and the whole clip plays. On subsequent plays the clip time is given as the length of the longest segment and only the last segment is played; the audio object also reports the duration as the length of the longest segment.
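A quick way to watch this is to log what the element reports (play-blob is the audio element from the example above; the comments describe what I see, which may differ elsewhere):

const player = document.getElementById('play-blob');
player.addEventListener('loadedmetadata', () => {
  // First load of the multi-segment blob reports Infinity here.
  console.log('duration on loadedmetadata:', player.duration);
});
player.addEventListener('durationchange', () => {
  // On later plays this settles at the length of a single segment.
  console.log('duration changed to:', player.duration);
});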
I've also solved my immediate problem by replacing the audio element with howler.js. That plays the entire clip repeatedly, as I expected.
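For reference, the replacement looks roughly like this (not my exact code; it assumes howler.js is already loaded, and the format entry is needed because Howler can't infer the container from a blob: URL):

// blobUrl is the object URL for the concatenated blob, as returned by blobUrl() above.
function playWithHowler(blobUrl) {
  const sound = new Howl({
    src: [blobUrl],
    format: ['webm']   // matches what MediaRecorder produces in my case
  });
  sound.play();
  return sound;
}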