Create a report from GCP Cloud SQL logs


I have enabled logging on my GCP PostgreSQL 11 Cloud SQL database. The logs are being redirected to a bucket in the same project and they are in a JSON format.

The logs contain queries which were executed on the database. Is there a way to create a decent report from these JSON logs with a few fields from the log entries? Currently the log files are in JSON and not very reader friendly.

Additionally, if a multi-line query is run, a separate log entry is created for each line. If there is also a way to recognize log entries that belong to the same query, that would be helpful, too!

Answer by Iñigo González (accepted):

I guess the easiest way is using BigQuery.

BigQuery will import those JSONL files properly and will assign proper field names to the JSON data.
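A minimal sketch of the import with the `bq` CLI; the dataset name, table name, and bucket path below are placeholders, so substitute your own:

```shell
# Create a dataset to hold the logs (name is a placeholder).
bq mk mydataset

# Load the exported JSONL log files; --autodetect lets BigQuery
# infer the schema and field names from the JSON keys.
bq load \
    --source_format=NEWLINE_DELIMITED_JSON \
    --autodetect \
    mydataset.cloudsql_logs \
    "gs://my-log-bucket/*.json"
```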

When you have multi-line queries, you'll see that they appear as multiple log entries in the JSON files.

It looks like all entries from a multi-line query have the same receiveTimestamp (which makes sense, since they were produced at the same time).

Also, the insertId field has an 's=xxxx' subfield that does not change for lines of the same statement. For example:

insertId: "s=6657e04f732a4f45a107bc2b56ae428c;i=1d4598;b=c09b782e120c4f1f983cec0993fdb866;m=c4ae690400;t=5b1b334351733;x=ccf0744974395562-0@a1"
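Extracting that subfield is a one-line regex; a small sketch using the insertId value from the example above:

```python
import re

# The insertId from the log entry above, split for readability.
insert_id = ("s=6657e04f732a4f45a107bc2b56ae428c;i=1d4598;"
             "b=c09b782e120c4f1f983cec0993fdb866;m=c4ae690400;"
             "t=5b1b334351733;x=ccf0744974395562-0@a1")

def statement_key(insert_id: str) -> str:
    """Return the 's=' subfield, shared by all lines of one statement."""
    match = re.match(r"s=([^;]+)", insert_id)
    return match.group(1) if match else ""

print(statement_key(insert_id))  # -> 6657e04f732a4f45a107bc2b56ae428c
```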

The strategy to extract the statements in the right line order is:

  1. Sort by the 's' field in insertId
  2. Then sort by receiveTimestamp ascending (to get all the lines sent at once to the syslog agent in the cloudsql service)
  3. And finally sort by timestamp ascending (to get the line ordering right)
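The three-step ordering above can be sketched in Python; the entries below are synthetic, but the field names (insertId, receiveTimestamp, timestamp, textPayload) match the real Cloud SQL log export:

```python
import re
from itertools import groupby

# Synthetic log entries for one statement, deliberately out of order.
entries = [
    {"insertId": "s=aaa;i=3", "receiveTimestamp": "2020-01-01T00:00:01Z",
     "timestamp": "2020-01-01T00:00:00.200Z", "textPayload": "WHERE id = 1"},
    {"insertId": "s=aaa;i=1", "receiveTimestamp": "2020-01-01T00:00:01Z",
     "timestamp": "2020-01-01T00:00:00.000Z", "textPayload": "SELECT *"},
    {"insertId": "s=aaa;i=2", "receiveTimestamp": "2020-01-01T00:00:01Z",
     "timestamp": "2020-01-01T00:00:00.100Z", "textPayload": "FROM users"},
]

def statement_id(entry):
    """Step 1 key: the 's=' subfield of insertId."""
    m = re.match(r"s=([^;]+)", entry["insertId"])
    return m.group(1) if m else ""

# Steps 1-3: sort by the 's' subfield, then receiveTimestamp,
# then timestamp (RFC 3339 strings sort correctly lexically).
ordered = sorted(entries, key=lambda e: (statement_id(e),
                                         e["receiveTimestamp"],
                                         e["timestamp"]))

# Reassemble one statement per 's=' key.
statements = {
    key: "\n".join(e["textPayload"] for e in group)
    for key, group in groupby(ordered, key=statement_id)
}
print(statements["aaa"])  # -> SELECT *\nFROM users\nWHERE id = 1
```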