In an AWS SageMaker Pipeline, I have a `CreateModelStep` that connects to a `TransformStep`. When the pipeline reaches the `TransformStep`, the following error is thrown:
```
Traceback (most recent call last):
  File "/miniconda3/lib/python3.7/site-packages/sagemaker_containers/_modules.py", line 258, in import_module
    module = importlib.import_module(name)
  File "/miniconda3/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'inference'
```
My steps are defined as follows.
```python
model_step = CreateModelStep(
    name='CreateModel',
    model=Model(
        name='RandomForestModel',
        image_uri=training_step.properties.AlgorithmSpecification.TrainingImage,
        model_data=training_step.properties.ModelArtifacts.S3ModelArtifacts,
        sagemaker_session=sagemaker_session,
        role=role,
    ),
    inputs=CreateModelInput(
        instance_type=instance_type,
        accelerator_type='ml.eia1.medium',
    ),
)
```
```python
transform_step = TransformStep(
    name='Transform',
    transformer=training_estimator.transformer(
        instance_count=instance_count,
        instance_type=instance_type,
        accept='text/csv',
        env={
            'SAGEMAKER_DEFAULT_INVOCATIONS_ACCEPT': 'text/csv',
            'SAGEMAKER_USE_NGINX': 'False',
            'SAGEMAKER_PROGRAM': 'inference.py',
            'SAGEMAKER_REGION': region,
            'SAGEMAKER_SUBMIT_DIRECTORY': '/opt/ml/model/',
        },
        model_name=model_step.properties.ModelName,
        output_path=f's3://{default_bucket}/RandomForestTransform',
    ),
    inputs=TransformInput(data=batch_data),
)
```
From the logs, I can see that a `pip install` was run. I suspect the problem comes from `SAGEMAKER_PROGRAM='inference.py'`. Any ideas on what I am doing wrong?
`sagemaker_containers` is failing to import your `inference.py`. Confirm that you do in fact have an `inference.py` inside the `model.tar.gz` that `model_data=training_step.properties.ModelArtifacts.S3ModelArtifacts` points to, since `SAGEMAKER_SUBMIT_DIRECTORY` is set to `/opt/ml/model/` (where that archive is extracted).