Using boto to create an AWS Data Pipeline for the RedshiftCopyActivity


I am trying to move data from S3 into Redshift and want to enforce uniqueness on primary keys in Redshift. I realized that the COPY command itself can't do this. However, I noticed that the RedshiftCopyActivity available through AWS Data Pipeline supports an "OVERWRITE_EXISTING" insert mode, which would at least enforce the primary key in some way.

I was wondering whether boto could be used to achieve this, and whether someone could point me to an example of such a use.


1 Answer

Answered by Junren:

I think boto can be used to create a data pipeline containing a RedshiftCopyActivity.

Here is the documentation for putting a definition into a data pipeline:

https://boto3.readthedocs.org/en/latest/reference/services/datapipeline.html#DataPipeline.Client.put_pipeline_definition
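For reference, a minimal sketch of that flow with boto3. The function name, pipeline name, and uniqueId are hypothetical; the pipeline_objects argument is a definition like the one sketched after the next link:

```python
import boto3


def create_redshift_copy_pipeline(pipeline_objects):
    """Create, define, and activate a Data Pipeline from a list of
    pipeline objects (see the definition sketched below)."""
    client = boto3.client('datapipeline')

    # create_pipeline returns the new pipeline's id; uniqueId makes
    # the call idempotent if it is retried.
    pipeline_id = client.create_pipeline(
        name='redshift-copy-pipeline',            # hypothetical name
        uniqueId='redshift-copy-pipeline-0001'
    )['pipelineId']

    # put_pipeline_definition validates the definition; it is only
    # saved if validation passes.
    result = client.put_pipeline_definition(
        pipelineId=pipeline_id,
        pipelineObjects=pipeline_objects
    )
    if result['errored']:
        raise RuntimeError(result['validationErrors'])

    # Nothing runs until the pipeline is activated.
    client.activate_pipeline(pipelineId=pipeline_id)
    return pipeline_id
```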

And here is an example pipeline definition for the RedshiftCopyActivity:

http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-copydata-redshift-define-pipeline-cli.html
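Translated into boto3's pipelineObjects format, a definition along the lines of that guide might look roughly like this. Every stringValue (log bucket, cluster id, credentials, table name, input path) is a placeholder, and the field that matters for the question is insertMode set to OVERWRITE_EXISTING:

```python
# Hypothetical pipelineObjects modeled on the CLI example above;
# every stringValue below is a placeholder to be replaced.
pipeline_objects = [
    {
        'id': 'Default',
        'name': 'Default',
        'fields': [
            {'key': 'scheduleType', 'stringValue': 'cron'},
            {'key': 'schedule', 'refValue': 'DefaultSchedule'},
            {'key': 'role', 'stringValue': 'DataPipelineDefaultRole'},
            {'key': 'resourceRole', 'stringValue': 'DataPipelineDefaultResourceRole'},
            {'key': 'pipelineLogUri', 'stringValue': 's3://my-log-bucket/logs/'},
        ],
    },
    {
        'id': 'DefaultSchedule',
        'name': 'RunOnce',
        'fields': [
            {'key': 'type', 'stringValue': 'Schedule'},
            {'key': 'occurrences', 'stringValue': '1'},
            {'key': 'period', 'stringValue': '1 Day'},
            {'key': 'startAt', 'stringValue': 'FIRST_ACTIVATION_DATE_TIME'},
        ],
    },
    {
        'id': 'Ec2Instance',
        'name': 'Ec2Instance',
        'fields': [
            {'key': 'type', 'stringValue': 'Ec2Resource'},
            {'key': 'instanceType', 'stringValue': 't1.micro'},
            {'key': 'terminateAfter', 'stringValue': '1 Hour'},
        ],
    },
    {
        'id': 'RedshiftCluster',
        'name': 'RedshiftCluster',
        'fields': [
            {'key': 'type', 'stringValue': 'RedshiftDatabase'},
            {'key': 'clusterId', 'stringValue': 'my-redshift-cluster'},
            {'key': 'username', 'stringValue': 'masteruser'},
            {'key': '*password', 'stringValue': 'my-password'},
            {'key': 'databaseName', 'stringValue': 'mydb'},
        ],
    },
    {
        'id': 'S3Input',
        'name': 'S3Input',
        'fields': [
            {'key': 'type', 'stringValue': 'S3DataNode'},
            {'key': 'directoryPath', 'stringValue': 's3://my-input-bucket/data/'},
        ],
    },
    {
        'id': 'RedshiftTable',
        'name': 'RedshiftTable',
        'fields': [
            {'key': 'type', 'stringValue': 'RedshiftDataNode'},
            {'key': 'tableName', 'stringValue': 'my_table'},
            {'key': 'database', 'refValue': 'RedshiftCluster'},
        ],
    },
    {
        'id': 'CopyToRedshift',
        'name': 'CopyToRedshift',
        'fields': [
            {'key': 'type', 'stringValue': 'RedshiftCopyActivity'},
            # The insert mode from the question: existing rows whose
            # primary key matches an incoming row are overwritten
            # rather than duplicated.
            {'key': 'insertMode', 'stringValue': 'OVERWRITE_EXISTING'},
            {'key': 'input', 'refValue': 'S3Input'},
            {'key': 'output', 'refValue': 'RedshiftTable'},
            {'key': 'runsOn', 'refValue': 'Ec2Instance'},
        ],
    },
]

pipeline_id = create_redshift_copy_pipeline(pipeline_objects)
```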