Can I use Iguazio to serve a model on a REST API?


Does Iguazio support Flask, FastAPI, or any other framework to serve models? And how do I secure the endpoints?

Answer from Marcelo Litovsky:

You can serve models with MLRun serving and the Nuclio runtime, both of which are integrated into the Iguazio platform, rather than running Flask or FastAPI yourself. The platform also includes an API gateway that lets you secure the REST endpoints (for example, with basic or access-key authentication).
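
Once a serving function is exposed through the API gateway, clients authenticate on every request. Here is a minimal sketch of calling a secured endpoint, assuming a gateway configured with basic authentication; the gateway URL and credentials are hypothetical placeholders:

import requests
from requests.auth import HTTPBasicAuth

# Hypothetical gateway URL and credentials; substitute the values
# configured for your API gateway in the Iguazio dashboard
gateway_url = 'https://my-gateway.example.com/v2/models/my_model/infer'

resp = requests.post(gateway_url,
                     json={'inputs': [[5.1, 3.5, 1.4, 0.2]]},
                     auth=HTTPBasicAuth('my-user', 'my-password'))
resp.raise_for_status()
print(resp.json())  # model predictions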

MLRun serving can produce managed real-time serverless pipelines from various tasks, including MLRun models or standard model files. The pipelines use the Nuclio real-time serverless engine, which can be deployed anywhere. Nuclio is a high-performance open-source “serverless” framework that’s focused on data, I/O, and compute-intensive workloads.

Here is a snippet of code showing a model deployment; see the MLRun serving documentation for more details.

from cloudpickle import load
import numpy as np
from typing import List
import mlrun

class ClassifierModel(mlrun.serving.V2ModelServer):
    def load(self):
        """load and initialize the model and/or other elements"""
        model_file, extra_data = self.get_model('.pkl')
        self.model = load(open(model_file, 'rb'))

    def predict(self, body: dict) -> List:
        """Generate model predictions from sample."""
        feats = np.asarray(body['inputs'])
        result: np.ndarray = self.model.predict(feats)
        return result.tolist()

# Package the class above as an MLRun serving function
serving_fn = mlrun.code_to_function('serving', kind='serving', image='mlrun/mlrun')
serving_fn.spec.default_class = 'ClassifierModel'
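
From there you can register a model on the function and deploy it as a live Nuclio endpoint. A minimal sketch, assuming a hypothetical model artifact path (models_path), which you would replace with your own artifact URI:

# Hypothetical model artifact path; replace with your own
# (an MLRun model artifact URI or an s3/v3io path to the .pkl file)
models_path = 'v3io:///projects/my-project/models/model.pkl'

# Register the model under a named route on the serving function
serving_fn.add_model('my_model', model_path=models_path)

# Deploy to the cluster as a real-time Nuclio endpoint
serving_fn.deploy()

# Smoke-test the live endpoint; MLRun routes V2 inference requests
# to ClassifierModel.predict()
result = serving_fn.invoke('/v2/models/my_model/infer',
                           body={'inputs': [[5.1, 3.5, 1.4, 0.2]]})
print(result)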