I am re-encoding video using AVAssetExportSession
and I want to keep the resulting file size below a limit. My final call looks like this:
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality])
{
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL = outputURL;
    exportSession.fileLengthLimit = 600000;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.videoComposition = mainCompositionInst;

    NSLog(@"bytes = %lli", exportSession.estimatedOutputFileLength);

    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([exportSession status])
        {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@ : %@", [[exportSession error] localizedDescription], [exportSession error]);
                handler(nil);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                handler(nil);
                break;
            default:
                handler(outputURL);
                break;
        }
    }];
}
However, estimatedOutputFileLength always returns 0, and fileLengthLimit seems to be ignored entirely. I wanted to use estimatedOutputFileLength to decide whether to use the Medium or Low quality preset.
Could this be an iOS bug? Has anyone gotten these two properties to work?
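For context, this is roughly the selection logic I had in mind, sketched as a separate probe session (a hypothetical sketch, assuming estimatedOutputFileLength returned a real value; the kMaxBytes constant and the probe session are my own additions, not part of the code above):

```objc
// Hypothetical: probe the Medium preset's estimated size and fall back to Low.
static const int64_t kMaxBytes = 600000; // target size limit

NSString *preset = AVAssetExportPresetMediumQuality;
AVAssetExportSession *probe = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:preset];
probe.outputFileType = AVFileTypeMPEG4;
if (probe.estimatedOutputFileLength > kMaxBytes)
{
    // Estimated Medium output exceeds the limit; use the Low quality preset instead.
    preset = AVAssetExportPresetLowQuality;
}
```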
Adding a note for posterity: regarding estimatedOutputFileLength always returning 0, I had to set a timeRange on the export session.
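A minimal sketch of that fix, assuming the same exportSession and avAsset as in the code above (set the range before reading the estimate or starting the export):

```objc
// With an explicit timeRange set, estimatedOutputFileLength returns a
// nonzero estimate instead of 0.
exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, avAsset.duration);
NSLog(@"bytes = %lli", exportSession.estimatedOutputFileLength);
```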