AWS Data Pipeline via CloudFormation throws error 'type is not defined in fields'


I'm trying to deploy the Export DynamoDB Table to S3 template via CloudFormation, but I'm getting a 'type is not defined in fields' error from CloudFormation. Every one of my PipelineObjects has a Key of type, with the exception of the Default PipelineObject, so I'm not sure what the error is referring to. Does anyone have any ideas as to what may be going on here? Thanks!

DataPipeline:
Type: AWS::DataPipeline::Pipeline
Properties:
  Name: ddb-export
  ParameterObjects:
    - Attributes:
        - Key: type
          StringValue: String
        - Key: description
          StringValue: Region of the DynamoDB table
        - Key: default
          StringValue: us-west-2
      Id: myDDBRegion
    - Attributes:
        - Key: type
          StringValue: String
        - Key: description
          StringValue: Source DynamoDB table name
      Id: myDDBTableName
    - Attributes:
        - Key: type
          StringValue: Double
        - Key: description
          StringValue: DynamoDB read throughput ratio
        - Key: default
          StringValue: "0.25"
      Id: myDDBReadThroughputRatio
    - Attributes:
        - Key: type
          StringValue: AWS::S3::ObjectKey
        - Key: description
          StringValue: Output S3 folder
      Id: myOutputS3Loc
  Activate: false
  PipelineObjects:
    - Fields:
        - Key: scheduleType
          StringValue: ondemand
        - Key: failureAndRerunMode
          StringValue: CASCADE
        - Key: role
          StringValue: datapipeline-ddb-export
        - Key: resourceRole
          StringValue: datapipeline-ddb-export-resource
      Id: Default
      Name: Default
    - Fields:
        - Key: tableName
          RefValue: "#{myDDBTableName}"
        - Key: type
          StringValue: DynamoDBDataNode
        - Key: readThroughputPercent
          RefValue: "#{myDDBReadThroughputRatio}"
      Id: DDBSourceTable
      Name: DDBSourceTable
    - Fields:
        - Key: type
          StringValue: S3DataNode
        - Key: directoryPath
          StringValue: "#{myOutputS3Loc}/#{format(@scheduledStartTime, 'YYYY-MM-dd-HH-mm-ss')}"
      Id: S3BackupLocation
      Name: S3BackupLocation
    - Fields:
        - Key: type
          StringValue: EmrCluster
        - Key: releaseLabel
          StringValue: emr-5.23.0
        - Key: masterInstanceType
          StringValue: m3.xlarge
        - Key: coreInstanceType
          StringValue: m3.xlarge
        - Key: coreInstanceCount
          StringValue: "1"
        - Key: region
          StringValue: "#{myDDBRegion}"
      Id: EmrClusterForBackup
      Name: EmrClusterForBackup
    - Fields:
        - Key: type
          StringValue: EmrActivity
        - Key: input
          RefValue: DDBSourceTable
        - Key: output
          RefValue: S3BackupLocation
        - Key: runsOn
          RefValue: EmrClusterForBackup
        - Key: resizeClusterBeforeRunning
          StringValue: "true"
        - Key: maximumRetries
          StringValue: "2"
        - Key: step
          StringValue: s3://dynamodb-dpl-#{myDDBRegion}/emr-ddb-storage-handler/4.11.0/emr-dynamodb-tools-4.11.0-SNAPSHOT-jar-with-dependencies.jar,org.apache.hadoop.dynamodb.tools.DynamoDBExport,#{output.directoryPath},#{input.tableName},#{input.readThroughputPercent}
      Id: TableBackupActivity
      Name: TableBackupActivity
Metadata:
  aws:cdk:path: ddb-export-CELL-dev/DataPipeline
1 Answer

Answered by stowns

It turns out the Default PipelineObject also needs a type field, with the value Default.
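Applying that fix to the template from the question, the Default PipelineObject would look like this (a sketch; the role names are carried over from the question as-is):

```yaml
# Default PipelineObject with the missing type field added
- Fields:
    - Key: type
      StringValue: Default
    - Key: scheduleType
      StringValue: ondemand
    - Key: failureAndRerunMode
      StringValue: CASCADE
    - Key: role
      StringValue: datapipeline-ddb-export
    - Key: resourceRole
      StringValue: datapipeline-ddb-export-resource
  Id: Default
  Name: Default
```

With this change, every entry in PipelineObjects carries a type field, which is what the 'type is not defined in fields' validation error was complaining about.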