iOS: Audio is missing in exported video


I am trying to export a recorded video, and the export itself succeeds. But the audio is missing from the final exported video. So I searched for a solution and added the code below for the audio.

if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0)
{
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
}

But after adding the above code I am no longer able to save the video. I am getting this error:

"session.status 4 error Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x17027e140 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}"

- (void)exportDidFinish:(AVAssetExportSession *)session {
    NSLog(@"session.status %ld error %@", session.status, session.error);
}
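
(Status 4 is AVAssetExportSessionStatusFailed.) Just to make the log easier to read, the handler could also switch on the status explicitly; nothing functional changes:

- (void)exportDidFinish:(AVAssetExportSession *)session {
    switch (session.status) {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Export completed");
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", session.error);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export cancelled");
            break;
        default:
            break;
    }
}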

Below is the code I use for exporting the video. Do you have any idea how I can export the video together with its audio? Thanks!

- (void)getVideoOutput {
    exportInProgress = YES;
    NSLog(@"videoOutputFileUrl %@", videoOutputFileUrl);
    AVAsset *videoAsset = [AVAsset assetWithURL:videoOutputFileUrl];
    NSLog(@"videoAsset %@", videoAsset);

    // 1 - Early exit if there's no video file selected
    NSLog(@"video asset %@", videoAsset);
    if (!videoAsset) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Please Load a Video Asset First"
                                                       delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
        return;
    }

    // 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    // 3 - Video track
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];

    /* getting an error AVAssetExportSessionStatusFailed
    if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0)
    {
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                             atTime:kCMTimeZero error:nil];
    }*/

    // 3.1 - Create AVMutableVideoCompositionInstruction
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

    // 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
    BOOL isVideoAssetPortrait_ = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationRight;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationLeft;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        videoAssetOrientation_ = UIImageOrientationUp;
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        videoAssetOrientation_ = UIImageOrientationDown;
    }
    [videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
    [videolayerInstruction setOpacity:0.0 atTime:videoAsset.duration];

    // 3.3 - Add instructions
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, nil];

    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];

    CGSize naturalSize;
    if (isVideoAssetPortrait_) {
        naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
    } else {
        naturalSize = videoAssetTrack.naturalSize;
    }

    float renderWidth, renderHeight;
    renderWidth = naturalSize.width;
    renderHeight = naturalSize.height;
    mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);

    int totalSeconds = (int)CMTimeGetSeconds(videoAsset.duration);
    [self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize videoDuration:totalSeconds];

    // 4 - Get path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"FinalVideo-%d.mov", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];

    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mainCompositionInst;

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        //dispatch_async(dispatch_get_main_queue(), ^{
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self exportDidFinish:exporter];
        });
    }];
}

1 Answer

Answered by Neovov:

I'm not sure if it helps, but here is how I did it on a project:

  1. Prepare the final composition

    AVMutableComposition *composition = [[AVMutableComposition alloc] init];
    
  2. Prepare the video track

    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    
  3. Prepare the audio track

    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    
  4. Insert the video data from the asset into the video track

    AVAssetTrack *video = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:video atTime:kCMTimeZero error:&error];
    
  5. Insert the audio data from the asset into the audio track

    AVAssetTrack *audio = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audio atTime:kCMTimeZero error:&error];
    
  6. Then, you can add some instructions to process your video and/or audio data

  7. Finally, you should be able to export using the code below (a consolidated sketch combining all of these steps follows this list):

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
    [exporter exportAsynchronouslyWithCompletionHandler:^{ /* code when the export is complete */ }];
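
Putting these steps together, a minimal sketch could look like the following. Here `asset` is the recorded AVAsset and `outputURL` is a placeholder for a writable file URL; adapt both to your project:

    AVMutableComposition *composition = [[AVMutableComposition alloc] init];

    // One composition track per media type: video and audio are kept separate.
    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *error = nil;

    // Insert the source video into the video composition track.
    AVAssetTrack *video = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:video atTime:kCMTimeZero error:&error];

    // Insert the source audio into the audio composition track, but only if the asset has one.
    AVAssetTrack *audio = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    if (audio) {
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audio atTime:kCMTimeZero error:&error];
    }

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                      presetName:AVAssetExportPresetMediumQuality];
    exporter.outputURL = outputURL;                     // placeholder: a file URL you can write to
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Export finished");
        } else {
            NSLog(@"Export failed: %@", exporter.error);
        }
    }];

The key difference from the question's code is that the audio data is inserted into its own AVMediaTypeAudio composition track rather than into the video track, which is likely what made the composition fail to export.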
    

Also, check that you properly recorded the audio.
The first time the camera is triggered, iOS should ask whether you want to allow access to the microphone. Check in your device settings that it is allowed.
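
If you want to check (or trigger) the microphone permission from code, a quick sketch could be:

    // Asks for microphone access the first time, then reports the stored decision.
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        NSLog(@"Microphone permission granted: %d", granted);
        // If granted is NO, recordings will not contain an audio track.
    }];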

Another option: you can retrieve your raw asset using the Window > Devices window in Xcode.
Select your device and export your app's data container to your computer. Then locate a recorded asset and open it with VLC, for example. Inspect the streams (Cmd + I) to see whether there are both an audio and a video track.
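
You can also do the same check in code, before building the composition, by logging how many audio and video tracks the recorded asset contains (using the videoOutputFileUrl from the question):

    AVAsset *recorded = [AVAsset assetWithURL:videoOutputFileUrl];
    NSLog(@"video tracks: %lu, audio tracks: %lu",
          (unsigned long)[[recorded tracksWithMediaType:AVMediaTypeVideo] count],
          (unsigned long)[[recorded tracksWithMediaType:AVMediaTypeAudio] count]);

If the audio track count is 0, the problem is in the recording, not in the export.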