AVAssetExportSession - The video could not be composed


I am trying to do some basic video composition in Xamarin / MonoTouch and am having some success, but I am stuck on what seems to be a rather simple task.

I record videos from the camera in portrait, so I use AVAssetExportSession to rotate them. I have created a layer instruction to rotate the video, which works fine; I am able to successfully export the video in the correct orientation.

The Issue:

When I add the audio track to the composition, the export always fails with this error:

Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1912c320 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

If I don't set the videoComposition property on the exportSession, the audio and video export perfectly fine, just with the wrong orientation. If anyone could give me some advice it would be greatly appreciated. Below is my code:

            var composition = new AVMutableComposition();
            var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
            var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
            var videoCompositionInstructions = new AVVideoCompositionInstruction[files.Count];
            var index = 0;
            var renderSize = new SizeF(480, 480);
            var _startTime = CMTime.Zero;

            var asset = new AVUrlAsset(new NSUrl(file, false), new AVUrlAssetOptions());
            //create an avassetrack with our asset
            var videoTrack = asset.TracksWithMediaType(AVMediaType.Video)[0];
            var audioTrack = asset.TracksWithMediaType(AVMediaType.Audio)[0];

            //create a video composition and preset some settings

            NSError error;

            var assetTimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = asset.Duration };

            compositionTrackAudio.InsertTimeRange(new CMTimeRange
            {
                Start = CMTime.Zero,
                Duration = asset.Duration,
            }, audioTrack, _startTime, out error);

            if (error != null) {
                Debug.WriteLine (error.Description);
            }

            compositionTrackVideo.InsertTimeRange(assetTimeRange, videoTrack, _startTime, out error);

            //create a video instruction
            var transformer = new AVMutableVideoCompositionLayerInstruction
            {
                TrackID = videoTrack.TrackID,
            };

            var audioMix = new AVMutableAudioMix ();
            var mixParameters = new AVMutableAudioMixInputParameters{ 
                TrackID = audioTrack.TrackID
            };

            mixParameters.SetVolumeRamp (1.0f, 1.0f, new CMTimeRange {
                Start = CMTime.Zero,
                Duration = asset.Duration
            });


            audioMix.InputParameters = new [] { mixParameters };
            var t1 = CGAffineTransform.MakeTranslation(videoTrack.NaturalSize.Height, 0);
            //Make sure the square is portrait
            var t2 = CGAffineTransform.Rotate(t1, (float)(Math.PI / 2f));
            var finalTransform = t2;

            transformer.SetTransform(finalTransform, CMTime.Zero);
            //add the transformer layer instructions, then add to video composition
            var instruction = new AVMutableVideoCompositionInstruction
            {
                TimeRange = assetTimeRange,
                LayerInstructions = new []{ transformer }
            };
            videoCompositionInstructions[index] = instruction;
            index++;
            _startTime = CMTime.Add(_startTime, asset.Duration);

            var videoComposition = new AVMutableVideoComposition();
            videoComposition.FrameDuration = new CMTime(1, (int)videoTrack.NominalFrameRate);
            videoComposition.RenderScale = 1;
            videoComposition.Instructions = videoCompositionInstructions;
            videoComposition.RenderSize = renderSize;

            var exportSession = new AVAssetExportSession(composition, AVAssetExportSession.PresetHighestQuality);

            var filePath = _fileSystemManager.TempDirectory + DateTime.UtcNow.Ticks + ".mp4";

            var outputLocation = new NSUrl(filePath, false);

            exportSession.OutputUrl = outputLocation;
            exportSession.OutputFileType = AVFileType.Mpeg4;
            exportSession.VideoComposition = videoComposition;
            exportSession.AudioMix = audioMix;
            exportSession.ShouldOptimizeForNetworkUse = true;
            exportSession.ExportAsynchronously(() =>
            {
                Debug.WriteLine(exportSession.Status);

                switch (exportSession.Status)
                {

                    case AVAssetExportSessionStatus.Failed:
                        {
                            Debug.WriteLine(exportSession.Error.Description);
                            Debug.WriteLine(exportSession.Error.DebugDescription);
                            break;
                        }
                    case AVAssetExportSessionStatus.Completed:
                        {
                            if (File.Exists(filePath))
                            {
                                _uploadService.AddVideoToVideoByteList(File.ReadAllBytes(filePath), ".mp4");
                                Task.Run(async () =>
                                {
                                    await _uploadService.UploadVideo(_videoData);
                                });
                            }
                            break;
                        }
                    case AVAssetExportSessionStatus.Unknown:
                        {
                            break;
                        }
                    case AVAssetExportSessionStatus.Exporting:
                        {
                            break;
                        }
                    case AVAssetExportSessionStatus.Cancelled:
                        {
                            break;
                        }

                }
            });

There are 3 answers

nite (accepted answer):

So this was a really stupid mistake: it was due to adding the audio track in before the video track, so the layer instruction must have been trying to apply the transform to the audio track rather than to my video track.
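
For illustration, here is a minimal sketch of that fix in the question's C#, reusing the question's variable names. Per this answer, the key is the ordering; pointing the layer instruction at the composition's video track ID, rather than the source asset's, is an equivalent and arguably more robust variant of the same fix:

// Sketch of the fix described above; variable names follow the question.
// Add the video track to the composition before the audio track.
var composition = new AVMutableComposition();
var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);

// Make the layer instruction target the composition's video track,
// not the source asset's video track, so the transform is applied to
// the track that is actually rendered.
var transformer = new AVMutableVideoCompositionLayerInstruction
{
    TrackID = compositionTrackVideo.TrackID,
};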

bojan:

In my case, it was passing the wrong track IDs into my implementation of AVVideoCompositionInstructionProtocol. Make sure they are correct. In fact, I'm writing this reply for myself from the future: I had this issue (wrong track IDs) in the past, spent some time on it back then, and now I couldn't figure it out again!
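
A quick way to catch this (an illustrative check, not part of the original answer) is to log the track IDs the composition actually assigned next to the ID each layer instruction targets, so a mismatch is obvious:

// Illustrative debugging sketch in the question's C#: dump the IDs of the
// composition's tracks and compare them with the layer instruction's target.
foreach (var track in composition.Tracks)
{
    Debug.WriteLine("Composition {0} track has TrackID {1}", track.MediaType, track.TrackID);
}
Debug.WriteLine("Layer instruction targets TrackID {0}", transformer.TrackID);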

onmyway133:

My problem was that I forgot to set the timeRange; it should be like this:

let instruction = AVMutableVideoCompositionInstruction()
instruction.layerInstructions = [layer]
instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: videoDuration)

Note that the end time of AVMutableVideoCompositionInstruction.timeRange must be valid. It is different from AVAssetExportSession.timeRange, whose documentation says:

The time range to be exported from the source. The default time range of an export session is kCMTimeZero to kCMTimePositiveInfinity, meaning that (modulo a possible limit on file length) the full duration of the asset will be exported. You can observe this property using Key-value observing.
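
In the question's C#, that distinction means the composition instruction needs an explicit, finite range, while the export session's range can normally be left at its default. A minimal sketch, reusing the question's variables:

// The video composition instruction must have a valid, finite time range
// covering the frames to be rendered.
instruction.TimeRange = new CMTimeRange
{
    Start = CMTime.Zero,
    Duration = composition.Duration,
};

// The export session's TimeRange can be left at its default
// (zero to positive infinity), which exports the full composition.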