Right now I'm working with the depth camera on iOS, since I want to measure the distance from the camera to certain points in the frame.
I did all the necessary setup in my camera solution, and now I have two CVPixelBufferRefs
in my hands - one with pixel data and one with depth data.
This is how I fetch both buffers from AVCaptureDataOutputSynchronizer:
- (void)dataOutputSynchronizer:(AVCaptureDataOutputSynchronizer *)synchronizer didOutputSynchronizedDataCollection:(AVCaptureSynchronizedDataCollection *)synchronizedDataCollection
{
    AVCaptureSynchronizedDepthData *syncedDepthData = (AVCaptureSynchronizedDepthData *)[synchronizedDataCollection synchronizedDataForCaptureOutput:depthDataOutput];
    AVCaptureSynchronizedSampleBufferData *syncedVideoData = (AVCaptureSynchronizedSampleBufferData *)[synchronizedDataCollection synchronizedDataForCaptureOutput:dataOutput];
    if (syncedDepthData.depthDataWasDropped || syncedVideoData.sampleBufferWasDropped) {
        return;
    }
    AVDepthData *depthData = syncedDepthData.depthData;
    CVPixelBufferRef depthPixelBuffer = depthData.depthDataMap;
    CMSampleBufferRef sampleBuffer = syncedVideoData.sampleBuffer;
    if (!CMSampleBufferDataIsReady(sampleBuffer)) {
        return;
    }
    //... code continues
}
Before getting any depth data, I decided to check whether the dimensions of my buffers match. I found that my pixel-data buffer
has dimensions 480x640
(portrait, like the orientation of my app), while my depth-data buffer
has dimensions 640x480
(landscape).
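For reference, this is how I read the dimensions inside the delegate above (a minimal sketch; `videoPixelBuffer` is just the image buffer extracted from the synchronized sample buffer):

```objc
// Extract the video pixel buffer from the synchronized sample buffer
// and compare its dimensions with the depth map's.
CVPixelBufferRef videoPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
NSLog(@"video: %zux%zu, depth: %zux%zu",
      CVPixelBufferGetWidth(videoPixelBuffer),
      CVPixelBufferGetHeight(videoPixelBuffer),   // logs 480x640 for me
      CVPixelBufferGetWidth(depthPixelBuffer),
      CVPixelBufferGetHeight(depthPixelBuffer));  // logs 640x480 for me
```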
Obviously, the buffers are different, so I cannot match pixels to depth values. Do I need to rotate my depth buffer somehow? Is this a known issue?
Please advise how I should solve this problem. Thanks in advance!
Yes, I see the same thing: the depth map is delivered in the sensor's native landscape orientation, so it has to be rotated to match a portrait video buffer. Hope this helps. The angle is in radians.
Caller side: I used the angle -(.pi/2).
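A minimal sketch of the rotation helper in Objective-C using Core Image (the method name `rotatedPixelBufferFrom:angle:` is mine, and I'm assuming CIContext can render into your depth buffer's pixel format, e.g. kCVPixelFormatType_DepthFloat32 - verify on your target devices):

```objc
#import <CoreImage/CoreImage.h>
#import <CoreVideo/CoreVideo.h>

// Rotates a pixel buffer by `angle` radians and returns a new buffer.
// The caller owns the returned buffer and must CVPixelBufferRelease it.
- (CVPixelBufferRef)rotatedPixelBufferFrom:(CVPixelBufferRef)srcBuffer angle:(CGFloat)angle
{
    CIImage *image = [CIImage imageWithCVPixelBuffer:srcBuffer];
    image = [image imageByApplyingTransform:CGAffineTransformMakeRotation(angle)];
    // Rotation moves the extent away from (0, 0); translate it back.
    CGRect extent = image.extent;
    image = [image imageByApplyingTransform:
                 CGAffineTransformMakeTranslation(-extent.origin.x, -extent.origin.y)];

    // For a 90-degree rotation, width and height swap.
    CVPixelBufferRef dstBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault,
                        (size_t)image.extent.size.width,
                        (size_t)image.extent.size.height,
                        CVPixelBufferGetPixelFormatType(srcBuffer),
                        NULL,
                        &dstBuffer);
    if (dstBuffer == NULL) {
        return NULL;
    }

    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{ context = [CIContext context]; });
    [context render:image toCVPixelBuffer:dstBuffer];
    return dstBuffer;
}
```

Caller side (-(M_PI / 2) is the Objective-C equivalent of the Swift -(.pi/2) above):

```objc
CVPixelBufferRef rotatedDepthBuffer =
    [self rotatedPixelBufferFrom:depthPixelBuffer angle:-(M_PI / 2)];
// ... use rotatedDepthBuffer, then:
CVPixelBufferRelease(rotatedDepthBuffer);
```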