NSTimer giving inexact results


I have a camera app where I am trying to limit the capture length to exactly 15 seconds.

I have tried two different approaches, and neither is working to my satisfaction.

The first approach is to fire a repeating timer every second:

self.timer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(countTime:) userInfo:[NSDate date] repeats:YES];

- (void)countTime:(NSTimer*)sender {
    NSDate *start = sender.userInfo;
    NSTimeInterval duration = [[NSDate date] timeIntervalSinceDate:start];
    NSInteger time = round(duration);
    if (time > 15) { 
        [self capture:nil]; // this stops capture
    }
}

This gives me a 15-second video about 8 times out of 10, with an occasional 16-second one. I have tried comparing against both the raw NSTimeInterval double and the rounded integer here, with no apparent difference.

The second approach is to fire a selector once after the desired duration, like so:

self.timer = [NSTimer scheduledTimerWithTimeInterval:15.0f target:self selector:@selector(capture:) userInfo:nil repeats:NO];

This calls the capture method (which stops camera capture) directly, and gives me the same results.

Is there something that I am overlooking here?

Now, because I have tested with a number of tweaked floating-point values as the cap (14.5, 15.0, 15.1, 15.5, 16.0, etc.) and I almost always see a 16-second video after a few tries, I am starting to wonder whether AVFoundation simply takes an extra second to flush the buffer.


There are 2 answers

Answer from jesses.co.tt:

Thanks to Paul and Linuxious for their comments and answers, and to Rory for thinking outside the box (an intriguing option).

And yes, in the end it is clear that NSTimer isn't sufficient by itself for this.

Instead, I listen for the captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: delegate method to fire, test the length of the recorded asset, and trim it appropriately.

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections error:(NSError *)error
{
    _isRecording = NO;

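    // Load the file that was just written and check how long it actually is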
    AVURLAsset *videoAsset = [AVURLAsset assetWithURL:outputFileURL];
    CMTime length = [videoAsset duration];
    CMTimeShow(length);

    if(CMTimeGetSeconds(length) > 15)
    {
        NSLog(@"Capture Longer Than 15 Seconds - Attempting to Trim");

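        // Build a time range covering only the first 15 seconds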
        Float64 preferredDuration = 15;
        int32_t preferredTimeScale = 30;
        CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(preferredDuration, preferredTimeScale));

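        // Export just that range back out, overwriting the original recording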
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:AVAssetExportPresetHighestQuality];
        exportSession.outputURL = outputFileURL;
        exportSession.outputFileType = AVFileTypeQuickTimeMovie;
        exportSession.timeRange = timeRange;

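        // Remove the over-long original first so the export session can write to the same URL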
        NSError *err = nil;
        BOOL removed = [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:&err];
        if (!removed) {
            NSLog(@"Error deleting file: %@", [err localizedDescription]);
        }
        else {
            [exportSession exportAsynchronouslyWithCompletionHandler:^{
                if (exportSession.status == AVAssetExportSessionStatusCompleted) {
                    NSLog(@"Export Completed - Passing URL to Delegate");
                    if ([self.delegate respondsToSelector:@selector(didFinishRecordingToOutputFileAtURL:error:)]) {
                        [self.delegate didFinishRecordingToOutputFileAtURL:outputFileURL error:error];
                    }
                }
                else if(exportSession.status == AVAssetExportSessionStatusFailed) {
                    NSLog(@"Export Error: %@", [exportSession.error localizedDescription]);
                    if ([self.delegate respondsToSelector:@selector(didFinishRecordingToOutputFileAtURL:error:)]) {
                        [self.delegate didFinishRecordingToOutputFileAtURL:outputFileURL error:exportSession.error ];
                    }
                }
            }];
        }

    }

}
Answer from Paul Cezanne:

NSTimer is not guaranteed to fire exactly when you want it to, only at some point at or after the scheduled fire time.

From Apple's docs:

A timer is not a real-time mechanism; it fires only when one of the run loop modes to which the timer has been added is running and able to check if the timer’s firing time has passed. Because of the various input sources a typical run loop manages, the effective resolution of the time interval for a timer is limited to on the order of 50-100 milliseconds. If a timer’s firing time occurs during a long callout or while the run loop is in a mode that is not monitoring the timer, the timer does not fire until the next time the run loop checks the timer. Therefore, the actual time at which the timer fires potentially can be a significant period of time after the scheduled firing time. See also Timer Tolerance.
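
For completeness, NSTimer (since iOS 7) also exposes a tolerance property; that is what the "Timer Tolerance" note above refers to. A minimal sketch of the one-shot variant from the question with an explicit tolerance (this only tells the system how much extra delay you will accept; it can never make the timer fire early, and it does not make stopping the capture pipeline instantaneous):

self.timer = [NSTimer scheduledTimerWithTimeInterval:15.0
                                              target:self
                                            selector:@selector(capture:)
                                            userInfo:nil
                                             repeats:NO];
// Give the system permission to fire up to ~50 ms late; tolerance never makes a timer fire early.
self.timer.tolerance = 0.05;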

But to answer your question: I used to work for a company whose app had a maximum video length of 15 seconds. I didn't write the video code, but I think we used an AVComposition after the fact to ensure that the video was no more than 15 seconds, and even then it could come out a frame short sometimes. See How do I use AVFoundation to crop a video
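
For reference, a rough sketch of that after-the-fact idea, trimming to at most 15 seconds with an AVMutableComposition. This is illustrative, not the code from that project; outputFileURL here is assumed to be the URL of the finished recording:

AVURLAsset *recorded = [AVURLAsset assetWithURL:outputFileURL];
AVMutableComposition *composition = [AVMutableComposition composition];

// Never take more than the first 15 seconds of the recording.
CMTime limit = CMTimeMakeWithSeconds(15, 600);
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, CMTimeMinimum(recorded.duration, limit));

NSError *error = nil;
// Copies the matching segment of every compatible track (video and audio) into the composition,
// which can then be played back or run through an AVAssetExportSession.
if (![composition insertTimeRange:range ofAsset:recorded atTime:kCMTimeZero error:&error]) {
    NSLog(@"Could not build trimmed composition: %@", error);
}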