Dataflow job errors: "The resource 'projects/<removed>/zones/us-central1-a/disks/<removed>-harness-0' is not ready"

One of our pipelines failed this morning with an error we've never seen before. In addition, we had to manually delete the one VM that was spun up in order to cancel/stop the job.

Has anything changed in the Dataflow service that could cause this error?
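
For context, the manual cleanup (whether done from the Developers Console or with gcloud) amounts to cancelling the job and then deleting the worker VM it left behind. With gcloud that looks roughly like the following; the instance and zone names here are taken from the error in the log below and will differ for other jobs:

> gcloud alpha dataflow jobs --project=<removed> cancel 2015-06-11_16_39_02-17130055143605818331
> gcloud compute instances delete dfp-denormalization-job-1-06111639-3db5-harness-0 --zone us-central1-a

The full driver log from the failed run follows.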

0    [main] INFO  com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner  - PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 49 files. Enable logging at DEBUG level to see which files will be staged.
2243 [main] INFO  com.<removed>.cdf.dfp.DFPDenormalizationCloudDataFlowJob  - Successfully created cloud dataflow service pipeline
2282 [main] INFO  com.<removed>.cdf.dfp.DFPDenormalizationCloudDataFlowJob  - Last loaded table was found. It will be processed for denormalization: Clicks_06_2015
2282 [main] INFO  com.<removed>.cdf.dfp.DFPDenormalizationCloudDataFlowJob  - Last loaded table was found. It will be processed for denormalization: ActiveViews_06_2015
2282 [main] INFO  com.<removed>.cdf.dfp.DFPDenormalizationCloudDataFlowJob  - Last loaded table was found. It will be processed for denormalization: Impressions_06_2015
2435 [main] WARN  com.google.cloud.dataflow.sdk.Pipeline  - Transform <removed>:<removed>.advertisers2 does not have a stable unique name.  In the future, this will prevent reloading streaming pipelines
2615 [main] WARN  com.google.cloud.dataflow.sdk.Pipeline  - Transform <removed>:<removed>.lineitems2 does not have a stable unique name.  In the future, this will prevent reloading streaming pipelines
2616 [main] WARN  com.google.cloud.dataflow.sdk.Pipeline  - Transform <removed>:<removed>.creative2name2 does not have a stable unique name.  In the future, this will prevent reloading streaming pipelines
2616 [main] WARN  com.google.cloud.dataflow.sdk.Pipeline  - Transform <removed>:<removed>.adunit2site2 does not have a stable unique name.  In the future, this will prevent reloading streaming pipelines
3236 [main] INFO  com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner  - Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
3241 [main] INFO  com.google.cloud.dataflow.sdk.util.PackageUtil  - Uploading 49 files from PipelineOptions.filesToStage to staging location to prepare for execution.
41834 [main] INFO  com.google.cloud.dataflow.sdk.util.PackageUtil  - Uploading PipelineOptions.filesToStage complete: 10 files newly uploaded, 39 files cached
Dataflow SDK version: 0.4.150602
51003 [main] INFO  com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner  - To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/<removed>/dataflow/job/2015-06-11_16_39_02-17130055143605818331
Submitted job: 2015-06-11_16_39_02-17130055143605818331
51004 [main] INFO  com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner  - To cancel the job using the 'gcloud' tool, run:
> gcloud alpha dataflow jobs --project=<removed> cancel 2015-06-11_16_39_02-17130055143605818331
2015-06-11T23:39:02.506Z: Detail:  (b056559940543e6a): Expanding GroupByKey operations into optimizable parts.
2015-06-11T23:39:02.509Z: Detail:  (b056559940543d60): Annotating graph with Autotuner information.
2015-06-11T23:39:02.759Z: Detail:  (b0565599405437a9): Fusing adjacent ParDo, Read, Write, and Flatten operations
2015-06-11T23:39:02.762Z: Detail:  (b05655994054369f): Fusing consumer Impressions_06_2015-ParDoDFP-transform into Impressions_06_2015-BQ-Read
2015-06-11T23:39:02.764Z: Detail:  (b056559940543595): Fusing consumer Impressions_06_2015-BQ-Write into Impressions_06_2015-ParDoDFP-transform
2015-06-11T23:39:02.766Z: Detail:  (b05655994054348b): Fusing consumer ActiveViews_06_2015-ParDoDFP-transform into ActiveViews_06_2015-BQ-Read
2015-06-11T23:39:02.767Z: Detail:  (b056559940543381): Fusing consumer ActiveViews_06_2015-BQ-Write into ActiveViews_06_2015-ParDoDFP-transform
2015-06-11T23:39:02.769Z: Detail:  (b056559940543277): Fusing consumer Clicks_06_2015-ParDoDFP-transform into Clicks_06_2015-BQ-Read
2015-06-11T23:39:02.771Z: Detail:  (b05655994054316d): Fusing consumer Clicks_06_2015-BQ-Write into Clicks_06_2015-ParDoDFP-transform
2015-06-11T23:39:02.818Z: Detail:  (b056559940543987): Adding StepResource setup and teardown to workflow graph.
2015-06-11T23:39:18.614Z: Error:   (5494fb7a460f58a8): Workflow failed. Causes: (20fbc2bb0e7cb0b1): One or more operations had an error: 'operation-1434065943092-518467f1f5b21-8d000d8a-d5cd5762': 'The resource 'projects/<removed>/zones/us-central1-a/disks/dfp-denormalization-job-1-06111639-3db5-harness-0' is not ready'.
2015-06-11T23:39:18.651Z: Detail:  (4fb958a4957733a5): Cleaning up.
2015-06-11T23:40:36.126Z: Error:   (d41cf136c17a5e79): Workflow failed. Causes: (20fbc2bb0e7cb0b1): One or more operations had an error: 'operation-1434065943092-518467f1f5b21-8d000d8a-d5cd5762': 'The resource 'projects/<removed>/zones/us-central1-a/disks/dfp-denormalization-job-1-06111639-3db5-harness-0' is not ready'.
2015-06-11T23:43:05.998Z: Warning: (c5964e114f42988b): Job 2015-06-11_16_39_02-17130055143605818331 is already finishing. Ignoring cancel request.
2015-06-11T23:48:04.715Z: Warning: (cf462c726cde3704): Job 2015-06-11_16_39_02-17130055143605818331 is already finishing. Ignoring cancel request.
2015-06-11T23:50:35.529Z: Warning: Internal Issue (4fb958a495773599): 65177287:8503
748739 [main] INFO  com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner  - Job finished with status FAILED
748740 [main] ERROR com.<removed>.cdf.dfp.DFPDenormalizationCloudDataFlowJob  - Job "dfp-denormalization-job-1434066640362" failed. Job may be retried.
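
As an aside, unrelated to the failure: the "does not have a stable unique name" warnings above come from transforms that are applied without explicit names. A minimal sketch of naming each step with the SDK's .named(...) builders (illustrative table names and a placeholder DoFn, not our actual code; pipeline and outputSchema stand for the job's own Pipeline and TableSchema objects):

import com.google.api.services.bigquery.model.TableRow;
import com.google.cloud.dataflow.sdk.io.BigQueryIO;
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.transforms.ParDo;
import com.google.cloud.dataflow.sdk.values.PCollection;

// Read one month's table, run the (placeholder) denormalization DoFn, and
// write the result back out, giving every transform an explicit, stable name.
PCollection<TableRow> impressions = pipeline.apply(
    BigQueryIO.Read.named("Impressions_06_2015-BQ-Read")
        .from("<project>:<dataset>.Impressions_06_2015"));

impressions
    .apply(ParDo.named("Impressions_06_2015-ParDoDFP-transform")
        .of(new DoFn<TableRow, TableRow>() {
          @Override
          public void processElement(ProcessContext c) {
            c.output(c.element()); // stand-in for the real denormalization logic
          }
        }))
    .apply(BigQueryIO.Write.named("Impressions_06_2015-BQ-Write")
        .to("<project>:<dataset>.Impressions_06_2015_denormalized")
        .withSchema(outputSchema)); // outputSchema: the job's own TableSchema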

1 Answer

Sam McVeety

This was a temporary issue with the Google Compute Engine API that has since been resolved. When calling GCE on behalf of the user, Dataflow will attempt to work around any transient errors.
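
Transient submission failures like this can still surface as a FAILED job, as happened here ("Job may be retried" in the driver log above). If you want to guard a scheduled batch job against one-off failures, one option is a small retry wrapper around pipeline.run(). This is only a sketch, not an SDK facility; it assumes the blocking runner reports failure by throwing a RuntimeException (which can vary by SDK version) and that the job name is either left to the default or varied per attempt:

import com.google.cloud.dataflow.sdk.Pipeline;

public class RetryingSubmit {
  // Hypothetical helper (not part of the Dataflow SDK): submit the pipeline up
  // to maxAttempts times, sleeping between attempts, and rethrow the last
  // failure if every attempt fails.
  public static void runWithRetries(Pipeline pipeline, int maxAttempts)
      throws InterruptedException {
    RuntimeException last = null;
    for (int attempt = 1; attempt <= maxAttempts; attempt++) {
      try {
        pipeline.run(); // blocks until the job finishes under BlockingDataflowPipelineRunner
        return;
      } catch (RuntimeException e) {
        last = e;
        // Simple linear backoff; a production job might first check whether
        // the failure looks transient before resubmitting.
        Thread.sleep(60_000L * attempt);
      }
    }
    if (last != null) {
      throw last;
    }
  }
}

Calling runWithRetries(p, 3) in place of a bare p.run() would have resubmitted this job after the first failure.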