How to get BigQuery Storage Write API logs on a Cloud Function?


I am writing a Cloud Function in Python that writes data to BigQuery. To work around quota limits, I am trying to use the BigQuery Storage Write API to write my data, following this documentation. The destination BigQuery table has DATETIME columns, but DATETIME does not seem to be supported by protobuf, so I use timestamps in my stream instead.
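For reference, here is a minimal sketch of the conversion I have in mind, assuming (per my reading of the data type conversion table) that a TIMESTAMP column is written as an int64 protobuf field holding microseconds since the Unix epoch; the helper name to_epoch_micros is just an illustration:

import datetime

def to_epoch_micros(dt: datetime.datetime) -> int:
    # Assumption: the TIMESTAMP column maps to an int64 protobuf field holding
    # microseconds since the Unix epoch; naive datetimes are treated as UTC here.
    return int(dt.replace(tzinfo=datetime.timezone.utc).timestamp() * 1_000_000)

# This value is what I put into the int64 field of my protobuf row message.
micros = to_epoch_micros(datetime.datetime(2023, 1, 15, 10, 30, 0))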

I have set up Cloud Logging at DEBUG level in my Cloud Function:

import logging
from google.cloud import logging as cloud_logging

# Route standard-library log records to Cloud Logging at DEBUG level
logging_client = cloud_logging.Client()
logging_client.get_default_handler()
logging_client.setup_logging(log_level=logging.DEBUG)

When launching my Cloud Function, I get the following error in the function logs: google.api_core.exceptions.Unknown: None There was a problem opening the stream. Try turning on DEBUG level logs to see the error.
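For reference, here is the kind of thing I could try to surface those DEBUG records, assuming the stream error is logged under the google.api_core or google.cloud.bigquery_storage_v1 logger names (that is a guess on my part); Cloud Functions forwards anything written to stdout into the function logs:

import logging
import sys

# Attach a plain stdout handler to the client-library loggers so their
# DEBUG records show up in the function logs (logger names are a guess).
for name in ("google.api_core", "google.cloud.bigquery_storage_v1"):
    lib_logger = logging.getLogger(name)
    lib_logger.setLevel(logging.DEBUG)
    lib_logger.addHandler(logging.StreamHandler(sys.stdout))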

How can I get the logs from the BigQuery Storage Write API on a Cloud Function? Moreover, can BigQuery automatically convert timestamps to DATETIME during the insert?


1 Answer

shollyman:

https://cloud.google.com/bigquery/docs/write-api#data_type_conversions has details about the possible mappings between the protobuf types and the corresponding BigQuery types.

For the datetime case, it's probably simplest to use a string protobuf field, and encode the value as a datetime literal.
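As a rough sketch of that (the field and helper names below are placeholders, not taken from your code), the protobuf field for the DATETIME column would be declared as string and filled with a DATETIME literal such as 2023-01-15 10:30:00:

import datetime

def to_datetime_literal(dt: datetime.datetime) -> str:
    # BigQuery DATETIME literal: "YYYY-MM-DD HH:MM:SS[.ffffff]", no time zone suffix
    return dt.strftime("%Y-%m-%d %H:%M:%S.%f")

# Goes into a `string` protobuf field that maps to the DATETIME column.
value = to_datetime_literal(datetime.datetime(2023, 1, 15, 10, 30, 0))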