I have a project that uses JMF to record both the webcam and audio inputs for a short time (a few seconds to a couple of minutes) and then write the results to a file.
The problem is that this file is never produced properly and cannot be played back.
While I've found numerous examples of multiplexed transmission of audio and video over RTP, and of converting an input file from one format to another, I haven't yet seen a working example that captures audio and video and writes them to a file.
Does anyone have an example of functioning code to do this?
I've found the reason I was unable to generate a file from two separate capture devices under JMF, and it comes down to the ordering of the start commands. In particular, a Processor will take a DataSource (or MergingDataSource), assign and synchronize the time base(s), and start and stop the sources for you, so the extra work I was doing to start the datasources manually was not just redundant, it actively broke things.
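To make the ordering point concrete, here is a minimal sketch of the principle (the class, method, and parameter names here are mine, not JMF's; the full pipeline follows at the end):

    import javax.media.Manager;
    import javax.media.Processor;
    import javax.media.protocol.DataSource;

    class StartOrdering {
        // Hand the capture sources to the Processor *unstarted*; it assigns
        // and synchronizes the time bases and starts the sources itself.
        static Processor begin(DataSource audio, DataSource video) throws Exception {
            // audio.start(); video.start();  <-- the mistake: redundant, and it
            // derails the Processor's own start sequencing.
            DataSource merged = Manager.createMergingDataSource(
                    new DataSource[] { audio, video });
            // Configuring/realizing the Processor is omitted here; see the
            // full solution below.
            return Manager.createProcessor(merged);
        }
    }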
Getting here took a lot of painful trial and error, so before implementing this yourself I'd suggest reading every line of the code, understanding the sequencing, and understanding what has been included, what has been left out, and why. JMF is quite the bear if you're not careful.
Oh, and remember to catch exceptions. I had to omit that code due to length restrictions.
Here's my final solution:
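In outline, the sequence looks like this (a minimal sketch, not verbatim code: the device indices, LINEAR/YUV formats, QuickTime container, and output path are assumptions to adapt, and exception handling is collapsed into `throws Exception` per the note above):

    import java.util.Vector;

    import javax.media.CaptureDeviceInfo;
    import javax.media.CaptureDeviceManager;
    import javax.media.DataSink;
    import javax.media.Format;
    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Processor;
    import javax.media.ProcessorModel;
    import javax.media.format.AudioFormat;
    import javax.media.format.VideoFormat;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.FileTypeDescriptor;

    public class CaptureToFile {
        public static void main(String[] args) throws Exception {
            // Find the first audio and video capture devices JMF knows about.
            Vector audioDevices = CaptureDeviceManager.getDeviceList(
                    new AudioFormat(AudioFormat.LINEAR));
            Vector videoDevices = CaptureDeviceManager.getDeviceList(
                    new VideoFormat(VideoFormat.YUV));
            CaptureDeviceInfo audioInfo = (CaptureDeviceInfo) audioDevices.get(0);
            CaptureDeviceInfo videoInfo = (CaptureDeviceInfo) videoDevices.get(0);

            // Create the capture datasources, but do NOT start them here:
            // the Processor assigns/synchronizes the time bases and starts
            // and stops the sources itself.
            DataSource audioSource = Manager.createDataSource(audioInfo.getLocator());
            DataSource videoSource = Manager.createDataSource(videoInfo.getLocator());
            DataSource merged = Manager.createMergingDataSource(
                    new DataSource[] { audioSource, videoSource });

            // A realized Processor that multiplexes both tracks into a
            // QuickTime container (FileTypeDescriptor.MSVIDEO gives AVI).
            Processor processor = Manager.createRealizedProcessor(new ProcessorModel(
                    merged,
                    new Format[] { new AudioFormat(AudioFormat.LINEAR),
                                   new VideoFormat(VideoFormat.YUV) },
                    new FileTypeDescriptor(FileTypeDescriptor.QUICKTIME)));

            // Wire the processor's output to the file and open/start the
            // sink before the processor runs.
            DataSink sink = Manager.createDataSink(processor.getDataOutput(),
                    new MediaLocator("file:./capture.mov"));
            sink.open();
            sink.start();

            // Start the Processor last; it starts the capture chain for you.
            processor.start();

            Thread.sleep(10000); // record for ten seconds

            // Tear down in reverse: stop/close the Processor so the sink
            // sees end-of-stream, then stop/close the sink. Production code
            // should wait for the DataSink's EndOfStreamEvent via a
            // DataSinkListener before closing.
            processor.stop();
            processor.close();
            sink.stop();
            sink.close();
        }
    }

The important part is the ordering at the bottom: open and start the DataSink, start the Processor last, and never call start() on the capture DataSources yourself.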