I am using kedro in conjunction with databricks-connect to run my machine learning model. I trained and tested the model in a Databricks notebook and saved the pickle file of the model to Azure Blob Storage. To test the pipeline locally I downloaded the pickle file and stored it locally. kedro reads the file fine when it is stored locally; the problem is that when I try to read the file into kedro directly from Azure, I get the following error: "invalid load key, '?'"
Unfortunately I cannot post any screenshots since I am doing this for my job. My thinking is that the pickle file is stored differently while on Azure, and only once downloaded locally is it stored correctly.
For a bit more clarity, I am attempting to load a logistic regression model trained using statsmodels (sm.logit).
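To narrow things down, here is a rough diagnostic sketch (not my actual setup) that compares the first bytes of the blob, read through fsspec/adlfs, against the local copy that loads fine. The abfs:// path, container name, and credential values are placeholders.

import fsspec

# Placeholders -- substitute the real container/blob path and credentials.
AZURE_PATH = "abfs://my-container/models/model.pkl"
STORAGE_OPTIONS = {"account_name": "<account>", "account_key": "<key>"}

# Read the first bytes of the blob exactly as a pickle loader would see them.
with fsspec.open(AZURE_PATH, mode="rb", **STORAGE_OPTIONS) as remote:
    remote_head = remote.read(16)

# Compare against the locally downloaded copy that loads without error.
with open("model.pkl", "rb") as local:
    local_head = local.read(16)

print("remote:", remote_head)
print("local: ", local_head)
# A pickle written with protocol >= 2 starts with b'\x80'. If the remote bytes
# start with something else (e.g. text or an error page), the blob itself, or
# the way it is being fetched, is the problem rather than the unpickling step.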
Pickles unfortunately aren't guaranteed to work across environments; at the very least the Python version (and possibly the OS) needs to match between the environment that wrote the pickle and the one that reads it. Is there a mismatch here?
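A quick way to check, as a minimal sketch: run the snippet below in both the Databricks notebook that writes the pickle and the local kedro environment that reads it, and compare the output.

import pickle
import sys

# Large gaps between the writer's and reader's versions are a common source
# of unpickling errors (e.g. protocol 5 pickles need Python 3.8+ to read).
print("python:", sys.version)
print("highest pickle protocol:", pickle.HIGHEST_PROTOCOL)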