AWS Lambda - How to Put ONNX Models in AWS Layers


Currently, I have been downloading my ONNX models from S3 like so:

import os
import boto3
import onnxruntime

s3 = boto3.client('s3')
if not os.path.isfile('/tmp/model.onnx'):
    s3.download_file('test', 'models/model.onnx', '/tmp/model.onnx')
inference_session = onnxruntime.InferenceSession('/tmp/model.onnx')

However, I want to avoid the latency of downloading this model on every cold start. To do so, I am looking to package the model in an AWS Lambda layer, but I'm having trouble getting it to work.

I tried creating a ZIP file structured like so:

- python
     - model.onnx

and loading it with inference_session = onnxruntime.InferenceSession('/opt/model.onnx'), but I got a "File doesn't exist" error. What should I do to make sure that the model can be found in the /opt/ directory?
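
For reference, this is a rough sketch of how I build and publish the layer; the layer name is just a placeholder, and for a large model the ZIP would have to go through S3 instead of an inline upload:

import zipfile
import boto3

# Place the model under a top-level "python/" folder inside the ZIP.
with zipfile.ZipFile('model_layer.zip', 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.write('model.onnx', arcname='python/model.onnx')

lambda_client = boto3.client('lambda')
with open('model_layer.zip', 'rb') as f:
    lambda_client.publish_layer_version(
        LayerName='onnx-model-layer',      # placeholder layer name
        Content={'ZipFile': f.read()},     # for large files, use S3Bucket/S3Key here instead
        CompatibleRuntimes=['python3.6'],
    )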

Note: My AWS Lambda function is running on Python 3.6.


1 Answer

Answered by Marcin (accepted answer):

With that layer layout, your file is extracted to /opt/python/model.onnx. Therefore, you should be able to load it with:

inference_session = onnxruntime.InferenceSession('/opt/python/model.onnx')

If you don't want your file to be in the python folder, then don't create the layer with that folder. Just put model.onnx in the ZIP's root folder, rather than inside the python folder; it will then be extracted to /opt/model.onnx.
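
As an illustration, a minimal handler using that root-level layout could look like the sketch below; the S3 fallback simply mirrors the code from the question and the bucket/key are the same placeholders:

import os
import boto3
import onnxruntime

LAYER_MODEL_PATH = '/opt/model.onnx'   # model.onnx placed at the ZIP root of the layer
TMP_MODEL_PATH = '/tmp/model.onnx'

def _model_path():
    # Prefer the copy baked into the layer; fall back to downloading from S3.
    if os.path.isfile(LAYER_MODEL_PATH):
        return LAYER_MODEL_PATH
    if not os.path.isfile(TMP_MODEL_PATH):
        boto3.client('s3').download_file('test', 'models/model.onnx', TMP_MODEL_PATH)
    return TMP_MODEL_PATH

# Created once per container, so warm invocations reuse the session.
inference_session = onnxruntime.InferenceSession(_model_path())

def handler(event, context):
    # ... run inference with inference_session ...
    return {'statusCode': 200}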