I am currently building a video editor with WebCodecs. I would like to use FFmpeg.wasm for muxing, since I previously worked with mp4box.js and ran into many problems.
The question I have is how best to give FFmpeg the raw encoded chunks.
Unfortunately, I have never worked with WASM or FFmpeg before.
How I imagined the pipeline:
I saved all encodedChunks in arrays like:
encodedVideoChunks: EncodedVideoChunk[];
encodedAudioChunks: EncodedAudioChunk[];
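The arrays are filled from the encoder output callback, roughly like this (the codec string and config values below are just placeholders):

```ts
const encodedVideoChunks: EncodedVideoChunk[] = [];

// Sketch of the encoder setup; codec and dimensions are placeholders.
const videoEncoder = new VideoEncoder({
  // Each encoded chunk is pushed into the array for the muxer to consume later.
  output: (chunk: EncodedVideoChunk) => { encodedVideoChunks.push(chunk); },
  error: (e: DOMException) => console.error('VideoEncoder error:', e),
});

videoEncoder.configure({
  codec: 'vp09.00.10.08', // VP9, so it matches a WebM output container
  width: 1280,
  height: 720,
  bitrate: 2_000_000,
  framerate: 30,
});
```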
A Muxer_FFmpeg class should now handle the encoded chunks. That's where I am stuck.
There is an example of how to deal with encoded chunks on web.dev (https://web.dev/webcodecs/#encoding).
You basically copy them to a typed array and store them for later use:
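Something along these lines (the array and function names here are just illustrative):

```ts
// Collected payloads, ready to be written to ffmpeg.wasm's virtual file system later.
const videoChunkData: Uint8Array[] = [];

function handleEncodedChunk(chunk: EncodedVideoChunk): void {
  // Copy the encoded bytes out of the chunk into a plain typed array.
  const data = new Uint8Array(chunk.byteLength);
  chunk.copyTo(data);
  videoChunkData.push(data);
}
```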
Muxing is then a simple ffmpeg command with the chunk data as input. You just have to avoid re-encoding by using codec copy. For a WebM container:
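A sketch of what that could look like, assuming the video chunks were written out as an IVF-wrapped VP8/VP9 stream, the audio as Opus, and `ffmpeg` is an already-loaded ffmpeg.wasm instance (file names are placeholders):

```ts
// Equivalent of the CLI command:
//   ffmpeg -i video.ivf -i audio.ogg -c copy output.webm
await ffmpeg.run(
  '-i', 'video.ivf', // VP8/VP9 stream, typically IVF-wrapped so ffmpeg can parse it
  '-i', 'audio.ogg', // Opus (or Vorbis) audio stream
  '-c', 'copy',      // stream copy: mux without re-encoding
  'output.webm'
);
```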
For other common ffmpeg commands, see https://web.dev/media-conversion/
FFmpeg.wasm FS API: https://github.com/ffmpegwasm/ffmpeg.wasm/blob/master/docs/api.md#ffmpeg-fs
In the case of the example above, you would use the fetchFile API on a Blob of the chunks, so the command would be something like:
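A minimal sketch along those lines, assuming the chunk payloads were collected into Uint8Array arrays as above (the helper name, file names, and stream formats are illustrative; the raw streams must be in a form ffmpeg can parse, e.g. IVF for VP8/VP9):

```ts
import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg';

// Mux previously collected encoded payloads into a WebM file without re-encoding.
async function muxToWebM(videoData: Uint8Array[], audioData: Uint8Array[]): Promise<Blob> {
  const ffmpeg = createFFmpeg({ log: true });
  await ffmpeg.load();

  // fetchFile accepts a Blob (among other inputs) and resolves to a Uint8Array
  // that can be written to ffmpeg.wasm's in-memory file system.
  ffmpeg.FS('writeFile', 'video.ivf', await fetchFile(new Blob(videoData)));
  ffmpeg.FS('writeFile', 'audio.ogg', await fetchFile(new Blob(audioData)));

  // Stream copy both inputs into the WebM container.
  await ffmpeg.run('-i', 'video.ivf', '-i', 'audio.ogg', '-c', 'copy', 'output.webm');

  // Read the muxed result back out of the virtual file system.
  const output = ffmpeg.FS('readFile', 'output.webm');
  return new Blob([output.buffer], { type: 'video/webm' });
}
```

The returned Blob can then be downloaded or played back in a video element via URL.createObjectURL.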