I'm trying to detect whether the user advances their clock while an app is running. I'm currently doing this by comparing how two timers change: [NSDate timeIntervalSinceReferenceDate] and mach_absolute_time.
The basic algorithm is this:
- At the start of the app, save startUserClock (timeIntervalSinceReferenceDate) and startSystemClock (mach_absolute_time converted to seconds)
- Periodically diff the current values of the timers against their respective start values.
- If the diffs differ by more than some margin of error, the timers are out of sync, which indicates a clock change - theoretically the only time this should be possible is if the user has modified their clock. (A rough sketch of this check follows the list.)
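A minimal sketch of what I mean, assuming the usual mach timebase conversion (the helper names and the tolerance value are purely illustrative):

```objc
#import <Foundation/Foundation.h>
#import <mach/mach_time.h>
#include <math.h>

// Convert mach_absolute_time() ticks to seconds using the Mach timebase.
static double MachTimeInSeconds(void) {
    static mach_timebase_info_data_t timebase;
    if (timebase.denom == 0) {
        mach_timebase_info(&timebase);
    }
    return (double)mach_absolute_time() * timebase.numer / timebase.denom / 1e9;
}

static NSTimeInterval startUserClock; // wall clock at launch
static double startSystemClock;       // mach clock at launch, in seconds

// Called once when the app starts.
static void RecordStartTimes(void) {
    startUserClock = [NSDate timeIntervalSinceReferenceDate];
    startSystemClock = MachTimeInSeconds();
}

// Called periodically: YES if the two elapsed times have diverged by more
// than the given tolerance, which is what I treat as a clock change.
static BOOL ClockAppearsModified(NSTimeInterval tolerance) {
    NSTimeInterval userElapsed = [NSDate timeIntervalSinceReferenceDate] - startUserClock;
    double systemElapsed = MachTimeInSeconds() - startSystemClock;
    return fabs(userElapsed - systemElapsed) > tolerance;
}
```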
However, it seems that mach_absolute_time is growing at a slightly faster rate than timeIntervalSinceReferenceDate. In the short term, this isn't a huge issue, but over time the difference adds up and we start seeing a lot of false positives.
This problem appears to be hardware dependent. I don't see it at all on the iPad 1 I have, but a colleague sees it on his iPad 2 and I see it in the simulator.
I've confirmed that the problem isn't in my conversion of mach_absolute_time to seconds by replacing it with CACurrentMediaTime (which uses mach_absolute_time under the hood). I've also tried swapping timeIntervalSinceReferenceDate for other wall-clock APIs (e.g. CFAbsoluteTimeGetCurrent).
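The sanity check for the conversion was along these lines (a sketch, reusing the illustrative MachTimeInSeconds() helper from above; both values come from mach_absolute_time, so their elapsed times should track each other exactly if the conversion is right):

```objc
#import <Foundation/Foundation.h>
#import <QuartzCore/QuartzCore.h> // for CACurrentMediaTime()

// Returns how far my own mach-to-seconds conversion drifts from
// CACurrentMediaTime() over the given interval; should be ~0.
static double ConversionDriftOverInterval(NSTimeInterval seconds) {
    double machStart  = MachTimeInSeconds();
    double mediaStart = CACurrentMediaTime();
    [NSThread sleepForTimeInterval:seconds];
    return (MachTimeInSeconds() - machStart) - (CACurrentMediaTime() - mediaStart);
}
```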
Is there something wrong with my base assumption that the two timers should grow at the same rate? My thinking was that unless something is fundamentally broken they shouldn't drift this far apart - they're both measuring time, just from different starting points.
Is there a better way to do this? I need a completely offline solution - we can't assume an internet connection.
You could try using gettimeofday to read the current system time. It returns the wall-clock time elapsed since the Unix epoch.
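A minimal sketch of reading it (the helper name is just for illustration):

```objc
#include <sys/time.h>

// Current wall-clock time, in seconds since the Unix epoch (1970-01-01 UTC).
static double WallClockSeconds(void) {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec + tv.tv_usec / 1e6;
}
```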