I've taken a look around, and there aren't many talks or examples on inertial navigation for iOS 5. I know that iOS 5 introduced some very cool sensor fusion algorithms:
// Serial queue so motion callbacks are processed in order
motionQueue = [[NSOperationQueue alloc] init];
[motionQueue setMaxConcurrentOperationCount:1];

motionManager = [[CMMotionManager alloc] init];
motionManager.deviceMotionUpdateInterval = 1.0 / 20.0; // 20 Hz
// The true-north reference frame needs the magnetometer (and location, for declination)
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
                                                   toQueue:motionQueue
                                               withHandler:^(CMDeviceMotion *motion, NSError *error) {
    // fused attitude, gravity, user acceleration, etc. arrive here
}];
I've watched both WWDC videos that cover the API used above. The resulting CMDeviceMotion contains the device attitude, along with gravity separated out from the user-induced acceleration.
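Concretely, each sample exposes the fused quantities directly (this is just me enumerating the CMDeviceMotion properties):

// Inside the handler: what each CMDeviceMotion sample exposes
CMAttitude *attitude = motion.attitude;                  // quaternion / rotation matrix / Euler angles
CMAcceleration gravity = motion.gravity;                 // gravity vector, in units of g
CMAcceleration user = motion.userAcceleration;           // user-induced acceleration, in units of g
CMRotationRate rate = motion.rotationRate;               // bias-corrected gyro, rad/s
CMCalibratedMagneticField field = motion.magneticField;  // calibrated field, microteslas, with accuracy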
Are there any open-source inertial navigation projects specifically for iOS 5 that take advantage of this new sensor fusion? I'm talking about further integrating this data with the GPS and magnetometer output to get a more accurate position estimate, roughly the kind of blend sketched below.
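To make that concrete, here is the rough shape I have in mind: dead-reckon between GPS fixes, then pull the estimate back toward each fix with a complementary blend. The Fuser class, the 0.98 weight, and the flat-earth degree-to-meter conversion are all placeholders of mine, not a real filter:

#import <CoreLocation/CoreLocation.h>

static const double kAlpha = 0.98;  // how much to trust the inertial estimate

@interface Fuser : NSObject <CLLocationManagerDelegate>
@property (nonatomic) double east, north;          // fused position, meters from origin
@property (nonatomic, strong) CLLocation *origin;  // first GPS fix
@end

@implementation Fuser

// Called from the device motion handler with the displacement since the
// last sample, already rotated into the true-north frame via motion.attitude.
- (void)deadReckonEast:(double)dE north:(double)dN {
    self.east  += dE;
    self.north += dN;
}

// iOS 5 CLLocationManagerDelegate callback
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation {
    if (!self.origin) { self.origin = newLocation; return; }
    // Flat-earth approximation: degrees -> meters near the origin
    double north = (newLocation.coordinate.latitude  - self.origin.coordinate.latitude)  * 111320.0;
    double east  = (newLocation.coordinate.longitude - self.origin.coordinate.longitude) * 111320.0
                   * cos(self.origin.coordinate.latitude * M_PI / 180.0);
    // Complementary blend: mostly inertial, nudged toward each GPS fix
    self.east  = kAlpha * self.east  + (1.0 - kAlpha) * east;
    self.north = kAlpha * self.north + (1.0 - kAlpha) * north;
}

@end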
A bonus question: is this kind of fusion even possible from a hardware standpoint? Will I melt my iPhone 4 if I start doing 20 Hz processing of all available sensor data over extended periods of time?
I'm ready to start tinkering with these APIs, but would love a more solid starting point than the empty handler above :)
Thank you for any pointers!
I am writing an app for scuba divers and hoped to add inertial navigation, since GPS and other radio-based navigation is unavailable underwater. I did quite a bit of research and found that there is just too much jitter in the iPhone's sensor data for accurate inertial navigation.

I ran a quick experiment and found that even with the device perfectly still, the "drift" due to noise in the signal showed the device "moving" many meters after only a few minutes. Below is the essence of the code I used in my experiment. If you can see something I am doing wrong, let me know. Otherwise, I want my afternoon back!
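Stripped down, the experiment was naive dead reckoning: double-integrate userAcceleration while the phone sits flat on a desk and log the accumulated "position" (a minimal reconstruction; the exact update rate, variable names, and logging in my real code differed):

#import <CoreMotion/CoreMotion.h>

CMMotionManager *manager = [[CMMotionManager alloc] init];
manager.deviceMotionUpdateInterval = 1.0 / 100.0; // 100 Hz

__block double vx = 0, vy = 0, vz = 0;   // velocity, m/s
__block double px = 0, py = 0, pz = 0;   // position, m
__block NSTimeInterval last = -1;

[manager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical
                                             toQueue:[[NSOperationQueue alloc] init]
                                         withHandler:^(CMDeviceMotion *motion, NSError *error) {
    if (error) return;
    if (last < 0) { last = motion.timestamp; return; }
    double dt = motion.timestamp - last;
    last = motion.timestamp;

    // userAcceleration is in units of g; gravity is already removed
    double ax = motion.userAcceleration.x * 9.81;
    double ay = motion.userAcceleration.y * 9.81;
    double az = motion.userAcceleration.z * 9.81;

    vx += ax * dt;  vy += ay * dt;  vz += az * dt;  // acceleration -> velocity
    px += vx * dt;  py += vy * dt;  pz += vz * dt;  // velocity -> position

    NSLog(@"apparent drift: %.2f m", sqrt(px * px + py * py + pz * pz));
}];

With the device untouched, that logged distance climbs steadily, which is the many-meters-in-minutes drift I described.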