Serving custom model from Kubeflow pipeline


I have a Kubeflow pipeline which trains a custom ML model (i.e. not based on sklearn / TensorFlow etc. classes). Now I would like to add serving at the end of the pipeline, i.e. I want a service in my Kubernetes cluster which uses the model to answer prediction requests, and this service should be updated with a new model after each pipeline run.

As far as I know, to serve a custom model I should:

  1. Wrap my model into a kfserving.KFModel class (see the sketch after this list)

  2. Create a Docker image that runs the wrapper from step 1

  3. Create an InferenceService endpoint using the image from step 2
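
For reference, a minimal sketch of such a wrapper (step 1), assuming the kfserving Python package; the stand-in model and all names here are placeholders, not a definitive implementation:

    from typing import Dict

    import kfserving

    class CustomModel(kfserving.KFModel):
        # Serves a custom (non-sklearn/TF) model behind KFServing's v1 protocol.

        def __init__(self, name: str):
            super().__init__(name)
            self.model = None
            self.ready = False

        def load(self):
            # Hypothetical: replace with your own deserialization
            # (e.g. unpickling an artifact baked into the image).
            self.model = lambda x: sum(x)  # stand-in for the real model
            self.ready = True

        def predict(self, request: Dict) -> Dict:
            # v1 protocol: {"instances": [...]} in, {"predictions": [...]} out
            instances = request["instances"]
            return {"predictions": [self.model(x) for x in instances]}

    if __name__ == "__main__":
        model = CustomModel("custom-model")
        model.load()
        kfserving.KFServer(workers=1).start([model])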

Is there any cloud-agnostic way to do this in a Kubeflow component? (So essentially the component must be able to build Docker images.)

Is there a better way to achieve my goal?

Maybe I should move steps 1-3 outside of the pipeline component and just create a component which triggers external execution of steps 1-3. Can this be done?
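
One hedged sketch of that last idea, assuming KFP v1 and a hypothetical external build/deploy webhook (the endpoint URL and payload are placeholders):

    from kfp import dsl

    def trigger_deploy_op(model_uri: str) -> dsl.ContainerOp:
        # Pipeline step that asks an external system (CI, Cloud Build,
        # Tekton, ...) to run steps 1-3 for the freshly trained model.
        return dsl.ContainerOp(
            name="trigger-model-deploy",
            image="curlimages/curl:7.85.0",
            command=["curl", "-sf", "-X", "POST",
                     "-H", "Content-Type: application/json",
                     "-d", f'{{"model_uri": "{model_uri}"}}',
                     "https://builder.example.com/hooks/deploy-model"],  # hypothetical
        )

    @dsl.pipeline(name="train-and-deploy")
    def pipeline(model_uri: str = "s3://models/run-123/model.bin"):
        trigger_deploy_op(model_uri)

The pipeline itself then only hands off a model URI; the image build and InferenceService update happen outside the run.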


There are 2 answers

E. Anderson

I can't speak to Kubeflow in particular, but https://buildpacks.io/ provides a general-purpose way to build containers from source code that satisfies certain input criteria (for example, "is a Python program with a main and a requirements.txt"). It's also possible (but more complicated) to create a new buildpack (for example, one that takes Python code implementing kfserving.KFModel and wraps a main and whatever else is needed around it). I've done this a few times in Python for demos etc.:

https://github.com/evankanderson/klr-buildpack
https://github.com/evankanderson/pyfun

Note that these aren't production-grade, just me playing around for a day or three.

You can run buildpacks locally with the pack command, or on a cluster using several technologies. There's detailed documentation for five build options at https://buildpacks.io/docs/tools/, along with a longer list of "supported platforms" at the bottom of https://buildpacks.io/features/.
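
As a hedged illustration, invoking pack from a Python step (the image name, source path, and builder here are placeholders; this assumes the pack CLI and a Docker daemon are available):

    import subprocess

    def build_with_pack(app_dir: str, image: str) -> None:
        # Builds a container image from source using Cloud Native Buildpacks;
        # any builder can be substituted for the Google one used here.
        subprocess.run(
            ["pack", "build", image,
             "--path", app_dir,
             "--builder", "gcr.io/buildpacks/builder:v1"],
            check=True,
        )

    build_with_pack("./serving", "registry.example.com/custom-model:latest")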

Theofilos Papapanagiotou

You should definitely move steps 1-3 outside of the Kubeflow pipeline; building the Docker images for your custom model server shouldn't be done on every pipeline run.

Having said that, your custom image should load the blessed model of a run from an external source, e.g. an S3/GCS/MinIO bucket.
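
For example, a load() that pulls the blessed artifact at startup might look roughly like this, assuming kfserving's Storage helper and placeholder bucket/file names:

    import os
    import pickle

    import kfserving

    class CustomModel(kfserving.KFModel):
        def __init__(self, name: str, storage_uri: str):
            super().__init__(name)
            self.storage_uri = storage_uri  # e.g. "s3://models/blessed" (placeholder)
            self.model = None
            self.ready = False

        def load(self):
            # Storage.download understands s3://, gs:// and pvc:// URIs,
            # so the same image stays cloud-agnostic.
            model_dir = kfserving.Storage.download(self.storage_uri)
            with open(os.path.join(model_dir, "model.pkl"), "rb") as f:  # hypothetical file
                self.model = pickle.load(f)
            self.ready = True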

Are you sure that your model is built on a framework that isn't already in the list of model servers KServe supports, so that you actually need to create a custom model server?
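
If it is on a supported framework, a built-in model server avoids the custom image entirely. A minimal sketch using the kfserving SDK (the names and storage URI are placeholders, assuming a v1beta1 install):

    from kubernetes import client
    from kfserving import (KFServingClient, V1beta1InferenceService,
                           V1beta1InferenceServiceSpec, V1beta1PredictorSpec,
                           V1beta1SKLearnSpec)

    # Declare an InferenceService backed by the built-in sklearn server.
    isvc = V1beta1InferenceService(
        api_version="serving.kubeflow.org/v1beta1",
        kind="InferenceService",
        metadata=client.V1ObjectMeta(name="my-model", namespace="default"),
        spec=V1beta1InferenceServiceSpec(
            predictor=V1beta1PredictorSpec(
                sklearn=V1beta1SKLearnSpec(
                    storage_uri="gs://my-bucket/models/latest"  # placeholder
                )
            )
        ),
    )

    KFServingClient().create(isvc)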