BigQuery Storage Write API: Concurrent connections per project for small regions per region


I am using the BigQuery Storage Write API to load data into BigQuery. It is used in a Dataflow job run in batch mode.
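For context, the write path looks roughly like this (a minimal sketch, assuming a Python Beam pipeline; the project, bucket, and table names are placeholders, not the real job's values):

```python
# Minimal sketch of the batch write path; project, bucket, and table
# names below are placeholders, not the real job's values.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                # placeholder
    region="us-central1",                # placeholder
    temp_location="gs://my-bucket/tmp",  # placeholder
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "CreateRows" >> beam.Create([{"id": 1, "name": "a"}])  # stand-in source
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            table="my-project:my_dataset.my_table",  # placeholder
            schema="id:INTEGER,name:STRING",
            # The Storage Write API path, which is what counts against the
            # concurrent-connections quota discussed here.
            method=beam.io.WriteToBigQuery.Method.STORAGE_WRITE_API,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```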

I noticed that on a particular day, when we were loading large volumes of data, we exceeded the default quota of 1,000 for 'Concurrent connections per project for small regions per region'.

Looking at the monitoring in the API/Service details page, I can see the peak usage (see screenshot), which confirms the quota was exceeded.

My question is:

Is there an out-of-the-box monitoring metric that gives away the Dataflow job ID that consumed so many connections?

I have checked the Dataflow metrics and BigQuery monitoring, but I did not find any metric that would give me the name of the job that created so many connections.
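Since nothing seems to expose the job ID directly, the closest I can get is to list which jobs were running during the spike window and correlate by hand; a rough sketch (project ID and region are placeholders):

```python
# Rough sketch: enumerate Dataflow jobs so they can be correlated by hand
# against the time of the quota spike. Project and region are placeholders.
from googleapiclient.discovery import build

dataflow = build("dataflow", "v1b3")
response = dataflow.projects().locations().jobs().list(
    projectId="my-project",  # placeholder
    location="us-central1",  # placeholder
).execute()

for job in response.get("jobs", []):
    # createTime lets me line jobs up against the quota spike window.
    print(job["id"], job["name"], job["currentState"], job["createTime"])
```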

In particular, I have tried the steps given here

https://cloud.google.com/bigquery/docs/monitoring-dashboard#view_quota_usage_and_limits

but I cannot see any of the metrics below available for selection as the limit value:

ConcurrentWriteConnectionsPerProject, ConcurrentWriteConnectionsPerProjectEU, and ConcurrentWriteConnectionsPerProjectRegion
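For reference, this is the kind of Cloud Monitoring query I have been trying in order to discover the exact limit names programmatically (a sketch, assuming the google-cloud-monitoring Python client; the project ID is a placeholder, and the serviceruntime quota metrics are the generic per-consumer quota metrics rather than anything BigQuery-specific):

```python
# Sketch: list quota-exceeded time series for the last day and print the
# quota_metric / limit_name labels, to discover the exact limit names
# (e.g. the ConcurrentWriteConnections* ones mentioned above).
import time
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {"end_time": {"seconds": now}, "start_time": {"seconds": now - 86400}}
)

series_iter = client.list_time_series(
    request={
        "name": "projects/my-project",  # placeholder
        "filter": 'metric.type = "serviceruntime.googleapis.com/quota/exceeded"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

for series in series_iter:
    labels = series.metric.labels
    print(labels.get("quota_metric"), labels.get("limit_name"))
```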

Any pointers will be useful.

[Screenshot: peak quota usage shown in the API/Service details monitoring view]
