Log entries in Google Cloud Logging duplicated the number of times the function has executed


I am triggering a Google Cloud Function using a Cloud Pub/Sub trigger and Google Cloud Scheduler. Log entries are duplicated the number of times the function has run on a given day. I am not sure whether the duplication hits a maximum.

I have confirmed I don't have any loops causing the issue, and when I tried outputting unstructured logs instead, I didn't see the same duplication.

import logging
from google.cloud import logging as cloudlogging

# Instantiate a Cloud Logging client and get its default handler
log_client = cloudlogging.Client()
log_handler = log_client.get_default_handler()
cloud_logger = logging.getLogger("cloudLogger")
cloud_logger.setLevel(logging.INFO)
cloud_logger.addHandler(log_handler)

cloud_logger.error('unable to delete from qb')
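For what it's worth, the multiplication pattern I'm seeing looks like what happens when the same named logger accumulates extra handlers across runs, since Python's `logging.getLogger("cloudLogger")` returns the same logger object and each attached handler emits every record. The sketch below (using a hypothetical list-collecting handler instead of the Cloud Logging one) reproduces that behavior locally by running the handler setup three times, as if for three invocations:

```python
import logging

class ListHandler(logging.Handler):
    """Hypothetical handler that collects formatted records in a list."""
    def __init__(self, store):
        super().__init__()
        self.store = store

    def emit(self, record):
        self.store.append(self.format(record))

records = []
logger = logging.getLogger("cloudLogger")  # same named logger as in my script
logger.setLevel(logging.INFO)

# Simulate the handler setup running once per invocation, three times:
for _ in range(3):
    logger.addHandler(ListHandler(records))

# A single log call now goes through every attached handler:
logger.error("unable to delete from qb")
print(len(records))  # prints 3 - one copy per attached handler
```

This matches my symptom (one extra copy per run, nearly identical timestamps), though I'm not certain it's what is happening inside Cloud Functions.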

I would expect one log entry per log statement in my script each time the script runs.

Example:

2020-10-01 15:25:04.562 AEST Unleashed_Quickbase_Jobs-1yk5flpwmlrll found 

What I actually get is the above log entry multiplied by the number of times my script has run that day, with nearly identical timestamps. See below, after my scheduled script has run 3 times:

Unleashed_Quickbase_Jobs-1yk5flpwmlrll found 2020-10-01 15:25:04.564 AEST

Unleashed_Quickbase_Jobs-1yk5flpwmlrll found 2020-10-01 15:25:04.566 AEST

Unleashed_Quickbase_Jobs-1yk5flpwmlrll found 2020-10-01 15:25:04.567 AEST 

I hope this makes it clearer.
