Using outputs from TrainingStep in SageMaker Pipeline

My TrainingStep works as needed in my SageMaker pipeline, but I can't seem to access its output in any of the subsequent pipeline steps. I am passing ProcessingInputs to the next step, but I can't figure out how to reference just the training job's output data (as an S3 URI) rather than the model artifact itself.
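
For context, the training step is defined roughly like this (a simplified sketch; the estimator configuration, image URI, and bucket layout are placeholders for my actual setup). The training script also writes files under /opt/ml/output/data, which, as far as I understand, SageMaker uploads as output.tar.gz next to model.tar.gz, and that is the output I want downstream:

from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput
from sagemaker.workflow.steps import TrainingStep

# Simplified placeholder estimator; the real one has my own image,
# role, hyperparameters, and instance configuration.
estimator = Estimator(
    image_uri=TRAINING_IMAGE_URI,
    role=ROLE_ARN,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path=f"s3://{WRITE_BUCKET}/training-output",
)

step_train = TrainingStep(
    name=f"{PIPELINE_PREFIX}-Training",
    estimator=estimator,
    inputs={"train": TrainingInput(s3_data=f"s3://{WRITE_BUCKET}/train")},
)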

process_step = ProcessingStep(
    name=f"{PIPELINE_PREFIX}-EvaluationProcessing",
    processor=evaluation_processor,
    inputs=[
        ProcessingInput(source=step_train.properties.ModelArtifacts.S3ModelArtifacts, destination="/opt/ml/processing/model"),
        ProcessingInput(source={INSERT_SOURCE_HERE}, destination="/opt/ml/processing/data"),
    ],
    outputs=[
        ProcessingOutput(destination=f"{PROCESSING_OUTPUT_URI}", output_name="evaluation", source="/opt/ml/processing/evaluation/evaluation.json"),
    ],
    property_files=[evaluation_report],
    code=f"s3://{WRITE_BUCKET}/evaluation.py"
)

Any idea what {INSERT_SOURCE_HERE} should be to dynamically reference the output data from my training step? That output is different from the model artifact referenced by step_train.properties.ModelArtifacts.S3ModelArtifacts.

I have tried passing it as a plain string, f"s3://{WRITE_BUCKET}/{step_train.properties.TrainingJobName.to_string()}/output/output.tar.gz", but that doesn't work.
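
Concretely, the attempt looked like this (a sketch using the same variables as above). I suspect the training job name is a pipeline property that is only resolved when the pipeline actually runs, so interpolating it into an ordinary Python string at definition time doesn't work, but I don't know what the correct way to construct this path inside the pipeline is:

from sagemaker.processing import ProcessingInput

# Attempted: build the output.tar.gz URI from the training job name.
# step_train.properties.TrainingJobName is only resolved at execution
# time, so (as far as I can tell) it cannot simply be interpolated
# into an f-string like this.
data_input = ProcessingInput(
    source=f"s3://{WRITE_BUCKET}/{step_train.properties.TrainingJobName.to_string()}/output/output.tar.gz",
    destination="/opt/ml/processing/data",
)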
