Unable to add AWS DataPipeline activity using awscli


I have many DynamoDB tables for which I need to set up backups in Data Pipeline. I can pass a JSON file via the AWS CLI for one or two tables, which confirms the JSON file itself works.

However, when I pass a large JSON (covering 50-100 DynamoDB tables) to set up the Data Pipeline, I get this error:

An error occurred (InvalidRequestException) when calling the PutPipelineDefinition operation: Web service limit exceeded: Exceeded maximum number of objects allowed per pipeline

I can create a separate JSON per DynamoDB table, but then the previously pushed pipeline definition gets overwritten by the next JSON.

The AWS command I am using is:

aws datapipeline put-pipeline-definition --pipeline-id df-XXXXXXXXXXXXX --pipeline-definition file:///home/varun/Desktop/df-XXXXXXXXXXXXX.json

My main question: is there any way to avoid overwriting the existing pipeline activities when calling put-pipeline-definition with multiple JSON files?

Edit:

1. I have a data pipeline as shown below. [Screenshot: Test Data Pipeline]

2. Below are the data nodes and the activity (backup) inside the data pipeline. [Screenshot: Activities and Data Nodes]

I have to create multiple (read: ~50) activities and data nodes using JSON. The JSON works for one activity, but the second one overwrites the existing one.
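To illustrate the overwriting behaviour, the definition currently stored in the pipeline can be dumped after each put (a sketch; the pipeline ID is redacted as in the command above):

# List the names of the objects currently in the pipeline definition;
# after pushing a second table's JSON, only that table's objects remain.
aws datapipeline get-pipeline-definition --pipeline-id df-XXXXXXXXXXXXX --query 'objects[].name'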


1 Answer

Answer by Brian R Armstrong

For each JSON file you need to create a separate pipeline:

aws datapipeline create-pipeline --name mytable --unique-id mytable
aws datapipeline put-pipeline-definition --pipeline-id <ID from previous command> --pipeline-definition file://mytable.json
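A minimal sketch of how this could be scripted across many tables, assuming one definition file per table at /home/varun/Desktop/<table>.json; the table names below are placeholders, and the final activate-pipeline call can be dropped if you prefer to activate each pipeline from the console:

# Create one pipeline per table, push its definition, then activate it.
# Table names and file paths are placeholders - adjust to your setup.
for table in Table1 Table2 Table3; do
  pipeline_id=$(aws datapipeline create-pipeline \
      --name "backup-${table}" \
      --unique-id "backup-${table}" \
      --query pipelineId --output text)
  aws datapipeline put-pipeline-definition \
      --pipeline-id "${pipeline_id}" \
      --pipeline-definition "file:///home/varun/Desktop/${table}.json"
  aws datapipeline activate-pipeline --pipeline-id "${pipeline_id}"
done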