ZONE_RESOURCE_POOL_EXHAUSTED for DataFlow & DataPrep


Alright team... Dataprep running into BigQuery. I cannot for the life of me figure out why I have been hitting the ZONE_RESOURCE_POOL_EXHAUSTED error for the past 5 hours. Last night everything was working fine, but today I am having some serious issues.

Can anyone give any insight into how to change the resource pool (region/zone) for the Dataflow jobs that Dataprep launches? I can't even get a basic column transform to push through.

Looking forward to anyone helping me with this, because honestly this issue feels like one of those "just change this and maybe that will fix it, and if not, maybe wait a few weeks and it'll work" situations.

Here is the issue in screenshot: https://i.stack.imgur.com/Qi4Dg.png

UPDATE:

I believe part of my issue may be related to GCP Compute Engine incident 18012, especially since it's a us-central-based issue affecting instance creation.

1 Answer

Answered by Atybzz:

The incident you mentioned was actually resolved on November 5th and only affected the us-central1-a zone. Given that your question was posted on November 10th and other users in the comments hit the error in the us-central1-b zone, the error is not related to the incident you linked.

As the error message suggests, this is a resource availability issue. These scenarios are rare and are usually resolved quickly. If this happens again in the future, running your Compute Engine workers in another region/zone will work around it. To do so with Dataprep, as mentioned in the comments: after the job has been launched from Dataprep once, you can re-run it from Dataflow while specifying the region/zone you would like it to run in (see the sketch below).
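
For illustration, here is a minimal sketch of what that re-run could look like with the gcloud CLI, assuming the Dataprep-generated job left behind a Dataflow template and staging bucket you can reuse. The job name, bucket paths, and zone below are placeholders (not values from the original post); copy the real ones from the original job's details page in the Dataflow console.

    # Hypothetical example: re-run a Dataprep-generated Dataflow job from its
    # template, but place the workers in a different region/zone where the
    # resource pool is not exhausted.
    gcloud dataflow jobs run my-dataprep-job-retry \
        --gcs-location=gs://my-bucket/templates/dataprep_job_template \
        --staging-location=gs://my-bucket/staging \
        --region=us-east1 \
        --zone=us-east1-b

The same idea applies if you use the "Clone" option on the failed job in the Dataflow console: keep the pipeline and parameters unchanged and only switch the regional endpoint and worker zone.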