I'm taking a CVImageBufferRef from the camera output, converting it to a CGImageRef with VideoToolbox, and then converting that to a UIImage.
The weird thing is that when I check the size of the UIImage, it reports the pixel width, NOT the point width.
In other words, CGImageGetWidth(image) == photoToCrop.size.width.
Clearly this is not correct, as the size of a UIImage is supposed to be in points.
https://developer.apple.com/documentation/uikit/uiimage/1624105-size?language=objc
"The logical dimensions of the image, measured in points."
Running iOS 11.1
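As I read that, point width should be pixel width divided by the image's scale factor. A minimal sketch of the relationship I expect (the 2.0 scale is just an illustrative value, not from my app):

    // Expected relationship per the docs: pointWidth = pixelWidth / scale.
    // A hypothetical 1080px-wide image at scale 2.0 should report 540.0 points;
    // at scale 1.0, points and pixels coincide at 1080.0.
    CGFloat pixelWidth = 1080.0;
    CGFloat scale = 2.0;
    CGFloat pointWidth = pixelWidth / scale; // 540.0

But here's what I'm actually seeing: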
#import <VideoToolbox/VideoToolbox.h>

CVImageBufferRef lastPixelBuffer; // assume this is filled already
...
// Wrap the pixel buffer in a CGImage, then in a UIImage
CGImageRef image;
(void)VTCreateCGImageFromCVPixelBuffer(lastPixelBuffer, NULL, &image);
UIImage *photoToCrop = [UIImage imageWithCGImage:image];
NSLog(@"image.width = %zu", CGImageGetWidth(image));
NSLog(@"photoToCrop.size.width = %f", photoToCrop.size.width);
NSLog(@"photoToCrop.scale = %f", photoToCrop.scale);
NSLog(@"underlying CGimageRef, width = %zu", CGImageGetWidth(photoToCrop.CGImage));
2017-11-02 16:16:00.895449-0400 TestBed[3950:1202023] image.width = 1080
2017-11-02 16:16:00.895515-0400 TestBed[3950:1202023] photoToCrop.size.width = 1080.000000
2017-11-02 16:16:00.895558-0400 TestBed[3950:1202023] photoToCrop.scale = 1.000000
2017-11-02 16:16:00.895663-0400 TestBed[3950:1202023] underlying CGImageRef, width = 1080
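For comparison, if I re-wrap the same CGImageRef with an explicit scale via imageWithCGImage:scale:orientation: (using the screen scale here purely as an example), the reported size does divide down the way the docs describe:

    // Hypothetical check: same CGImage, explicit scale.
    // On a 2x screen this reports 540.0 points for the 1080px-wide image.
    CGFloat screenScale = [UIScreen mainScreen].scale;
    UIImage *scaledPhoto = [UIImage imageWithCGImage:image
                                               scale:screenScale
                                         orientation:UIImageOrientationUp];
    NSLog(@"scaledPhoto.size.width = %f", scaledPhoto.size.width);
    NSLog(@"scaledPhoto.scale = %f", scaledPhoto.scale);

So it looks like imageWithCGImage: defaults to scale 1.0, which would make points equal pixels here. Is that the expected behavior, or am I misreading the docs?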