I am trying to import a custom XGBoost model into the Model Registry in Vertex AI. My data lives in BigQuery, but it needs some preprocessing before it is passed to the model.
How can I import the model so that, when I deploy it or run a batch prediction against the BigQuery data, the preprocessing runs before the prediction?
I see that the Model Registry searches the target folder for a model file with a specific extension (e.g. .pkl). Does it also look for preprocessing steps in some specific format?
I know Vertex AI can build pipelines. Do I need to learn Kubeflow to solve this, i.e. build a pipeline that does the preprocessing and feeds the result to the model?
You could take a look at Custom Prediction Routines (CPR), which let you package preprocessing and postprocessing code together with your model without building a serving container from scratch, so you don't need a Kubeflow pipeline just for serving-time preprocessing:
https://cloud.google.com/vertex-ai/docs/predictions/custom-prediction-routines
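A rough sketch of what the predictor looks like, just to show the shape of the hooks. In a real project the class would subclass `Predictor` from `google.cloud.aiplatform.prediction` and be baked into a serving image with `LocalModel.build_cpr_model()`; the SDK base class and the XGBoost model are stubbed out below (the GCS path and class name are made up), so only the hook order — load, preprocess, predict, postprocess — is demonstrated:

```python
# Sketch of the hook interface a Vertex AI Custom Prediction Routine
# expects. The CPR server calls load() once at startup, then runs every
# request through preprocess -> predict -> postprocess.


class XgbPreprocessingPredictor:
    """Stands in for a subclass of google.cloud.aiplatform.prediction.Predictor."""

    def load(self, artifacts_uri: str) -> None:
        # In a real CPR, artifacts_uri points at your GCS model directory
        # and you would unpickle model.pkl (plus any fitted scaler) here.
        # Stubbed with a trivial "model" that sums each row's features.
        self._model = lambda rows: [sum(row) for row in rows]

    def preprocess(self, prediction_input: dict) -> list:
        # Runs before every prediction: this is where your BigQuery-sourced
        # features get cleaned/transformed. Here we just cast to float.
        instances = prediction_input["instances"]
        return [[float(v) for v in row] for row in instances]

    def predict(self, instances: list) -> list:
        return self._model(instances)

    def postprocess(self, prediction_results: list) -> dict:
        return {"predictions": prediction_results}


# Local smoke test, chaining the hooks in the order the server uses.
predictor = XgbPreprocessingPredictor()
predictor.load("gs://your-bucket/model-dir")  # hypothetical path
raw = {"instances": [["1", "2"], ["3", "4"]]}
out = predictor.postprocess(predictor.predict(predictor.preprocess(raw)))
print(out)  # {'predictions': [3.0, 7.0]}
```

Once the CPR container image is built and pushed, you upload it to the Model Registry (e.g. via `Model.upload` with the built `LocalModel`), and both online and batch predictions from BigQuery then pass through your `preprocess` hook automatically.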