How to use a machine learning model in Objective-C with Core ML


I have a machine-vision project written in Objective-C (a couple of thousand lines). To complete it I need to import my model, My_model.mlmodel, with the latest Core ML framework. (As a digression: My_model.mlmodel was created in Python with coremltools.) I am trying to instantiate it, but nothing works, and I couldn't find any tutorials or help on this topic. Of course, the model works when I import it into a pure Swift project, so I attached a Swift class to my project hoping to make it work that way, but here again Xcode translates the model into an "Objective-C generated interface for model", and the model is not visible from the Swift class.

The picture below shows that Xcode automatically imports the .mlmodel as an Objective-C class.

Xcode imports mlmodel as Objective-C class

I need to feed a vector into the model and get the response back.

Please help me; I am stuck a couple of lines from completing this project. How do I use My_model.mlmodel inside Objective-C? Is there any workaround, or maybe a straightforward way like in Swift?
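For reference, once the generated class is visible (add the .mlmodel to the target and import its generated header), feeding a vector into the model from Objective-C looks roughly like this. This is a minimal sketch: the class name My_model, the output class My_modelOutput, and the method predictionFromInput:error: are assumptions — Xcode derives the actual prediction method name from the model's input feature name, so check the generated My_model.h for the real signature.

```objectivec
#import <CoreML/CoreML.h>
#import "My_model.h" // generated by Xcode from My_model.mlmodel

- (void)predictFromVector:(NSArray<NSNumber *> *)vector {
    NSError *error = nil;

    // Wrap the vector in an MLMultiArray; the shape and data type
    // must match what the model declares for its input.
    MLMultiArray *input = [[MLMultiArray alloc] initWithShape:@[@(vector.count)]
                                                     dataType:MLMultiArrayDataTypeDouble
                                                        error:&error];
    if (!input) {
        NSLog(@"MLMultiArray error: %@", error);
        return;
    }
    for (NSUInteger i = 0; i < vector.count; i++) {
        input[@[@(i)]] = vector[i];
    }

    My_model *model = [[My_model alloc] init];
    // Hypothetical method name; Xcode generates it from the
    // model's input feature name (see the generated My_model.h).
    My_modelOutput *output = [model predictionFromInput:input error:&error];
    if (!output) {
        NSLog(@"Prediction error: %@", error);
        return;
    }
    NSLog(@"Prediction: %@", output);
}
```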

Thanks a lot.


There are 2 answers

Matheus Domingos

This is what worked for me with my own image-recognition model:

#import <CoreML/CoreML.h>
#import <Vision/Vision.h>
#import "Your_Model_Here.h"    

- (void)readModelMLObjcWithImage:(UIImage *)image {
    NSError *modelError = nil;
    MLModel *model = [[[Your_Model_Here alloc] init] model];
    VNCoreMLModel *m = [VNCoreMLModel modelForMLModel:model error:&modelError];
    if (!m) {
        NSLog(@"Failed to create VNCoreMLModel: %@", modelError);
        return;
    }

    VNCoreMLRequest *request = [[VNCoreMLRequest alloc] initWithModel:m completionHandler:^(VNRequest *req, NSError *error) {
        dispatch_async(dispatch_get_main_queue(), ^{
            // For a classifier, the results are VNClassificationObservation objects.
            VNClassificationObservation *topResult = (VNClassificationObservation *)req.results.firstObject;
            if (topResult) {
                NSLog(@"%f: %@", topResult.confidence, topResult.identifier);
            }
        });
    }];
    request.imageCropAndScaleOption = VNImageCropAndScaleOptionCenterCrop;

    CIImage *ciImage = [[CIImage alloc] initWithImage:image];

    dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
        VNImageRequestHandler *handler = [[VNImageRequestHandler alloc] initWithCIImage:ciImage options:@{}];
        NSError *handlerError = nil;
        if (![handler performRequests:@[request] error:&handlerError]) {
            NSLog(@"Vision request failed: %@", handlerError);
        }
    });
}

I hope I've helped ;)
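If you need more than the top label, you can sort the observations in request.results by confidence before reading them off. A small sketch, assuming the request produces VNClassificationObservation objects as in the answer above (Vision usually returns classifications already ordered, but sorting makes the intent explicit):

```objectivec
// Log the top 3 classifications, highest confidence first.
NSArray<VNClassificationObservation *> *sorted =
    [request.results sortedArrayUsingComparator:^NSComparisonResult(VNClassificationObservation *a,
                                                                    VNClassificationObservation *b) {
        if (a.confidence > b.confidence) return NSOrderedAscending;
        if (a.confidence < b.confidence) return NSOrderedDescending;
        return NSOrderedSame;
    }];
NSUInteger topN = MIN((NSUInteger)3, sorted.count);
for (VNClassificationObservation *obs in [sorted subarrayWithRange:NSMakeRange(0, topN)]) {
    NSLog(@"%@ (%.2f)", obs.identifier, obs.confidence);
}
```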

Alexandr Tchausoff

Maybe this Objective-C project will help you: https://github.com/freedomtan/SimpleInceptionV3-ObjC/tree/continuous-mobilenet/SimpleInceptionV3-ObjC

In my project I use this method to init my model:

#import <CoreML/CoreML.h>
#import "my_model.h"

@property (nonatomic, strong) my_model *model;

- (void)configureModel {
    NSURL *modelUrl = [[NSBundle mainBundle] URLForResource:@"my_model" withExtension:@"mlmodelc"];
    NSError *error = nil;
    self.model = [[my_model alloc] initWithContentsOfURL:modelUrl error:&error];
    if (!self.model) {
        NSLog(@"Model not configured: %@", error);
    }
}

Some explanations about why "mlmodelc": https://blog.zedge.net/developers-blog/hotswapping-machine-learning-models-in-coreml-for-iphone
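On the same point: a .mlmodel bundled in an app is compiled by Xcode into an .mlmodelc directory, which is why the URL above asks for that extension. If you obtain a raw .mlmodel at runtime instead, Core ML can compile it on the device with +[MLModel compileModelAtURL:error:]. A minimal sketch — the file path here is a placeholder for wherever your downloaded model was saved:

```objectivec
#import <CoreML/CoreML.h>

// Compile a raw .mlmodel obtained at runtime, then load it as an MLModel.
NSURL *downloadedUrl = [NSURL fileURLWithPath:@"/path/to/My_model.mlmodel"]; // placeholder path
NSError *error = nil;
NSURL *compiledUrl = [MLModel compileModelAtURL:downloadedUrl error:&error];
if (!compiledUrl) {
    NSLog(@"Compile failed: %@", error);
} else {
    MLModel *model = [MLModel modelWithContentsOfURL:compiledUrl error:&error];
    NSLog(@"Loaded model: %@", model.modelDescription);
}
```

Note that the compiled .mlmodelc lands in a temporary directory, so move it to a permanent location if you want to reuse it across launches.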