MonoTouch AVAssetReader


I'm trying to convert a sample from Objective-C to MonoTouch, and I have run into some difficulties.

Basically I want to read a video file and decode the frames one by one into an OpenGL texture.

The key to doing this is AVAssetReader, but I am not sure how to set it up properly in MonoTouch.

This is my code:

    AVUrlAsset asset=new AVUrlAsset(NSUrl.FromFilename(videoFileName),null);
    assetReader=new AVAssetReader(asset,System.IntPtr.Zero);
    AVAssetTrack videoTrack=asset.Tracks[0];
    NSDictionary videoSettings=new NSDictionary();

    NSString key = CVPixelBuffer.PixelFormatTypeKey;
    NSNumber val=0x754b9d0; //NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; - Had to hardcode constant as it is not defined in Monotouch?

    videoSettings.SetNativeField(key,val);

//**The program crashes here:
    AVAssetReaderTrackOutput trackOutput=new AVAssetReaderTrackOutput(videoTrack,videoSettings);

    assetReader.AddOutput(trackOutput);
    assetReader.StartReading();

The program crashes on the line indicated above with an invalid-argument exception, indicating that the contents of the NSDictionary are not in the right format. I have checked the video file and it loads fine; "asset" contains valid information about the video.

This is the original Objective C code:

                NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
                NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
                NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
                AVAssetReaderTrackOutput *trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoSettings];

                [_assetReader addOutput:trackOutput];
                [_assetReader startReading];

I'm not that familiar with Objective-C, so any help is appreciated.

EDIT: I used the code suggested below

var videoSettings = NSDictionary.FromObjectAndKey (
  new NSNumber ((int) MonoTouch.CoreVideo.CVPixelFormatType.CV32BGRA),
  MonoTouch.CoreVideo.CVPixelBuffer.PixelFormatTypeKey);

and the program no longer crashes. Using the following code:

        CMSampleBuffer buffer=assetReader.Outputs[0].CopyNextSampleBuffer();
        CVImageBuffer imageBuffer = buffer.GetImageBuffer();

I get the image buffer which should contain the next frame in the video file. By inspecting the imageBuffer object, I find it has valid data such as the width and height, matching that of the video file.

However, the imageBuffer's BaseAddress is always 0, which suggests the image has no data. As a test, I tried this:

        CVPixelBuffer buffer=(CVPixelBuffer)imageBuffer;
        CIImage image=CIImage.FromImageBuffer(buffer);

And image is always returned as null. Does this mean the actual image data is not present, and my imageBuffer object only contains the frame header info?

And if so, is this a bug in MonoTouch, or am I setting this up wrong?

I had an idea that I may need to wait for the image data to become ready, but in that case I do not know how to do that either. Pretty stuck now...
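One thing I have not tried yet: the CoreVideo documentation says a pixel buffer's base address is only valid while the buffer is locked (CVPixelBufferLockBaseAddress), and MonoTouch exposes this as Lock/Unlock on CVPixelBuffer. An untested sketch of what I think is required before reading BaseAddress:

    // Untested sketch: lock the pixel buffer before touching its memory,
    // then unlock when done. Lock/Unlock wrap CVPixelBufferLockBaseAddress.
    CVPixelBuffer pixelBuffer = (CVPixelBuffer) imageBuffer;
    pixelBuffer.Lock (CVOptionFlags.None);
    try {
        IntPtr baseAddress = pixelBuffer.BaseAddress; // should be non-zero while locked
        int bytesPerRow = pixelBuffer.BytesPerRow;
        // ... upload the pixel data at baseAddress to the OpenGL texture here ...
    } finally {
        pixelBuffer.Unlock (CVOptionFlags.None);
    }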


There are 2 answers

Rolf Bjarne Kvinge (BEST ANSWER)

You need to create the NSDictionary like this:

var videoSettings = NSDictionary.FromObjectAndKey (
  new NSNumber ((int) MonoTouch.CoreVideo.CVPixelFormatType.CV32BGRA),
  MonoTouch.CoreVideo.CVPixelBuffer.PixelFormatTypeKey);

SetNativeField is something completely different (you're setting the field named CVPixelBuffer.PixelFormatTypeKey to 0x754b9d0, not adding a key/value pair to the dictionary).
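For completeness, here is a sketch of how the question's setup might look with this dictionary. It reuses the same MonoTouch APIs as the question, except that the AVAssetReader is created with the constructor taking an out NSError (mirroring initWithAsset:error:) rather than the IntPtr handle constructor; untested:

    NSError error;
    var asset = new AVUrlAsset (NSUrl.FromFilename (videoFileName), null);
    var assetReader = new AVAssetReader (asset, out error);

    var videoTrack = asset.Tracks [0];
    // Build the output settings dictionary as a proper key/value pair.
    var videoSettings = NSDictionary.FromObjectAndKey (
        new NSNumber ((int) MonoTouch.CoreVideo.CVPixelFormatType.CV32BGRA),
        MonoTouch.CoreVideo.CVPixelBuffer.PixelFormatTypeKey);

    var trackOutput = new AVAssetReaderTrackOutput (videoTrack, videoSettings);
    assetReader.AddOutput (trackOutput);
    assetReader.StartReading ();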

poupou

[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; - Had to hardcode constant as it is not defined in Monotouch?

You should be able to replace this with:

CVPixelFormatType.CV32BGRA

Note that MonoTouch defines this value as 0x42475241, which differs from yours. That could be your error. If not, I suggest you make a small, self-contained test case and attach it to a bug report at http://bugzilla.xamarin.com and we'll have a look at it.

A link to the Objective-C sample, if available, would be helpful too (either as an update to your question or on the bug report).