I have a very long-running Python script that calls multiple APIs.
I have also added logging at each step to capture exceptions, but the FileHandler only writes the data to the log file once the Python script has completed.
Is there a way to flush the logger at a fixed interval? I could use it to monitor the status of the job.
# when executed standalone, the main block is called
if __name__ == '__main__':
    # Create logger for RR api application
    logger = logging.getLogger('RR_API')
    logger.setLevel(logging.INFO)
    # create file handler which logs messages
    # If the handler is already defined, then no need to set it again
    if not logger.handlers:
        fh = logging.FileHandler("C:/Users/2128/python_data_logger.log")
        fh.setLevel(logging.DEBUG)
        # create formatter and add it to the handlers
        formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
        fh.setFormatter(formatter)
        # add the handler to the logger
        logger.addHandler(fh)
    run_ts = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
    logger.info('Starting Time' + '|' + run_ts)
    print('Starting Time', datetime.now().strftime('%Y-%m-%d %H:%M:%S'))
    count = 0
    token = get_temp_token(run_ts)
    portfolio = get_orgs(token, 'https://apidata.ratings.com/v1.0/orgs/', run_ts)
Output in Log File:
2016-12-30 16:22:01,448 - RR_API - INFO - Starting Time|2016-12-30 16:22:01
2016-12-30 16:22:01,526 - RR_API - CRITICAL - No Access Token was downloaded | 2016-12-30 16:22:01
The lines above become visible in the log file only after the script execution has terminated.
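A minimal sketch of two possible approaches (not part of the question's code; the `FlushingFileHandler` name, the `flush_periodically` helper, and the 60-second interval are my own illustration). The first flushes the stream after every record; the second flushes on a fixed interval, as asked, using a daemon `threading.Timer`:

```python
import logging
import threading


class FlushingFileHandler(logging.FileHandler):
    """FileHandler that flushes its stream after every record.

    Note: logging.StreamHandler.emit already calls flush() after each
    record, so this is mainly a defensive guarantee; emit() is also a
    convenient hook if an os.fsync() is needed to force bytes to disk.
    """

    def emit(self, record):
        super().emit(record)
        self.flush()


def flush_periodically(handler, interval=60.0):
    """Flush `handler` now, then again every `interval` seconds."""
    handler.flush()
    timer = threading.Timer(interval, flush_periodically, args=(handler, interval))
    timer.daemon = True  # do not keep the process alive just to flush
    timer.start()
```

Either can be dropped into the setup above, e.g. `fh = FlushingFileHandler("C:/Users/2128/python_data_logger.log")` followed by `logger.addHandler(fh)`, or `flush_periodically(fh, 60.0)` after the handler is attached.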