Recording Wavesurfer output with MediaRecorder


I am working on a React web application that has multiple WaveSurfer audio players that can play concurrently. Ideally I would like to play several of them at once and then click a separate 'Record' button to record the mix of sounds. Before attempting that, I am writing a smaller app that just records the output of a single WaveSurfer instance into a MediaRecorder object, taking my cues from here

Below is the code for my main component. What I expect to happen is: the user clicks play on the audio, clicks the record button, then clicks to stop recording, and the <audio> element should then contain the recorded audio:

import React, { useState, useEffect, useRef } from 'react';
import WaveSurfer from 'wavesurfer.js';

function Waveform({ src }) {
  const [wavesurferPlayer, setWavesurferPlayer] = useState(null);
  const [isLooping, setIsLooping] = useState(false);
  const [isPlaying, setIsPlaying] = useState(false);
  const waveContainerRef = useRef();

  // Create the audio context, destination and recorder once and keep
  // them across re-renders -- recreating them on every render would
  // swap in a fresh MediaRecorder between the RECORD and STOP clicks.
  const recorderRef = useRef(null);
  if (recorderRef.current === null) {
    const AudioContext = window.AudioContext || window.webkitAudioContext;
    const ctx = new AudioContext();
    const dest = ctx.createMediaStreamDestination();
    const mediaRecorder = new MediaRecorder(dest.stream);
    const chunks = [];

    mediaRecorder.ondataavailable = (evt) => {
      console.log(evt);
      chunks.push(evt.data);
    };

    mediaRecorder.onstop = () => {
      console.log('stopped');
      const blob = new Blob(chunks, { type: 'audio/ogg; codecs=opus' });
      console.log(blob);
      document.getElementById('blargaudio').src = URL.createObjectURL(blob);
    };

    recorderRef.current = { ctx, mediaRecorder };
  }
  const { ctx, mediaRecorder } = recorderRef.current;

  useEffect(() => {
    const wavesurferInstance = WaveSurfer.create({
      audioContext: ctx,
      container: waveContainerRef.current,
      waveColor: "#999",
      progressColor: "#00ffaf",
      backend: "WebAudio",
      responsive: true,
    });

    wavesurferInstance.load(src);

    wavesurferInstance.fireEvent('ready');

    // const bufferSource = wavesurferInstance.backend.ac.createBufferSource();
    // bufferSource.buffer = wavesurferInstance.backend.buffer;
    // bufferSource.connect(wavesurferInstance.backend.ac.destination);

    setWavesurferPlayer(wavesurferInstance);
  }, []);

  useEffect(() => {
    if (!wavesurferPlayer) return;

    const onFinish = () => {
      if (isLooping) {
        wavesurferPlayer.play();
      } else {
        wavesurferPlayer.pause();
      }
    };
    wavesurferPlayer.on('finish', onFinish);

    // Detach the old handler whenever isLooping changes, otherwise a
    // new 'finish' listener piles up on every toggle.
    return () => wavesurferPlayer.un('finish', onFinish);
  }, [wavesurferPlayer, isLooping]);

  const handlePlayClick = () => {
    if (isPlaying) {
      setIsPlaying(false);
      wavesurferPlayer.pause();
    } else {
      setIsPlaying(true);
      wavesurferPlayer.play();
    }
  }

  const startRecording = () => {
    mediaRecorder.start();
  }

  const stopRecording = () => {
    mediaRecorder.requestData();
    mediaRecorder.stop();
  }

  return (
    <div className="waveform">
      <button onClick={handlePlayClick}>{isPlaying ? 'Pause' : 'Play'}</button>
      <button onClick={() => setIsLooping(!isLooping)}>Loop</button>
      <p>{isLooping ? 'Looping on' : 'Looping off'}</p>
      <div className="wave" ref={waveContainerRef}></div>
      <div>
        <button onClick={startRecording}>RECORD</button>
        <button onClick={stopRecording}>STOP</button>
        <audio id="blargaudio" controls></audio>
      </div>
    </div>
  )
}

export default Waveform

At this point I can't figure out how to connect the output of the WaveSurfer instance to the context's destination node. You may notice some commented-out code in the no-dependency useEffect that I took from this question; however, the logs show that no data is coming through, and that older question is missing some useful details. Any advice would be most appreciated. Even if this works, the next hurdle will be connecting multiple players to the one destination, and whether that will work at all.
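For reference, here is the kind of routing helper I have been sketching for the multi-player case. It assumes that WaveSurfer's WebAudio backend exposes its final output gain stage as backend.gainNode (something I still need to verify against the installed wavesurfer.js version), and that connecting that node into a shared MediaStreamDestination is enough to feed the recorder:

```javascript
// Sketch: route one or more WaveSurfer players into a single
// MediaStreamDestination so one MediaRecorder can capture the mix.
// ASSUMPTION: with the WebAudio backend, each instance exposes its
// output gain stage as `player.backend.gainNode` -- verify this
// against your wavesurfer.js version before relying on it.
function connectPlayersToRecorder(players, audioCtx) {
  const dest = audioCtx.createMediaStreamDestination();
  players.forEach((player) => {
    // gainNode stays connected to the speakers; this adds a second
    // connection feeding the shared recording destination.
    player.backend.gainNode.connect(dest);
  });
  return new MediaRecorder(dest.stream);
}
```

If that assumption holds, the component above would call this once with all the WaveSurfer instances and the shared AudioContext, then drive the returned recorder from the RECORD/STOP buttons.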
