Open evil-shrike opened 1 month ago
ok, this configuration seems to work:

```python
import logging

import google.cloud.logging
from google.cloud.logging.handlers import CloudLoggingHandler, setup_logging

client = google.cloud.logging.Client()
handler = CloudLoggingHandler(client, name=LOGGER_NAME)
handler.setLevel(loglevel)
setup_logging(handler)

logger = logging.getLogger(LOGGER_NAME)
logger.setLevel(loglevel)
```
but not all messages from a Cloud Function are logged with the default background transport. As soon as I changed the transport to SyncTransport, I started seeing my messages:
```python
from google.cloud.logging.handlers.transports import SyncTransport

handler = CloudLoggingHandler(client, name=LOGGER_NAME, transport=SyncTransport)
```
When it runs locally, the background transport does flush the log queue on shutdown:

```
Program shutting down, attempting to send 1 queued log entries to Cloud Logging...
Waiting up to 5 seconds.
Sent all pending logs.
```
but this does not seem to happen when running in Cloud Run.
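The flushing behavior above can be sketched without any Cloud dependencies. This is a hypothetical stdlib-only illustration (it does not use google-cloud-logging at all): a `MemoryHandler` stands in for a batching background transport, showing why records that are only queued are lost if the process is killed before the exit-time flush runs, and why flushing explicitly (or using a synchronous transport) avoids that.

```python
import io
import logging
import logging.handlers

# A StringIO stream stands in for the remote logging destination.
stream = io.StringIO()
target = logging.StreamHandler(stream)

# MemoryHandler buffers records until capacity is reached or a record at
# flushLevel (default ERROR) arrives -- mimicking a background/batching
# transport that sends entries later rather than immediately.
buffered = logging.handlers.MemoryHandler(capacity=100, target=target)

logger = logging.getLogger("flush-demo")
logger.setLevel(logging.INFO)
logger.propagate = False  # keep the sketch self-contained
logger.addHandler(buffered)

logger.info("queued message")
print("queued message" in stream.getvalue())  # False: still buffered

# logging.shutdown() does this automatically at interpreter exit, but a
# reclaimed serverless instance may never reach that hook, so an explicit
# flush is the safe option in short-lived environments.
buffered.flush()
print("queued message" in stream.getvalue())  # True
```

The same reasoning applies to the background transport: an explicit flush before the request handler returns, or a synchronous transport, ensures entries are sent while the instance still has CPU.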
The problem with the initial code is that it doesn't use setup_logging (it only calls logger.addHandler(handler)), so the default console handler stays attached. I actually saw output from that console handler while messages from CloudLoggingHandler didn't appear at all.
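The difference between the two setups comes down to standard-library record propagation: a handler attached with addHandler receives records, but those records also propagate up to the root logger's console handler unless something removes or reroutes it. A stdlib-only sketch (no Cloud dependencies; the "cloud" handler here is just a StreamHandler standing in for CloudLoggingHandler):

```python
import io
import logging

# root_stream stands in for the default console handler's output;
# force=True resets any handlers a previous basicConfig installed.
root_stream = io.StringIO()
logging.basicConfig(stream=root_stream, level=logging.INFO, force=True)

# cloud_stream stands in for CloudLoggingHandler's destination.
cloud_stream = io.StringIO()
cloud_handler = logging.StreamHandler(cloud_stream)

logger = logging.getLogger("my-app")
logger.setLevel(logging.INFO)
logger.addHandler(cloud_handler)  # addHandler alone, as in the initial code

logger.info("hello")

# The record reached BOTH destinations: the attached "cloud" handler,
# and the root (console) handler via propagation.
print("hello" in cloud_stream.getvalue())  # True
print("hello" in root_stream.getvalue())   # True
```

This is why setup_logging matters: it wires the handler into the root logger (and excludes a few noisy loggers), so records are routed through the cloud handler instead of just duplicated alongside the default console output.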
Additionally, the reference documentation for CloudLoggingHandler says:
This handler is used when not in GAE or GKE environment.
which is strange, since the handler definitely supports the GAE environment.
@evil-shrike What code are you running on the Cloud Function that gives you issues with not seeing all the logged messages?
I'm setting up CloudLoggingHandler with a custom name so that I can filter log entries in Cloud Logging. But regardless of how I set it up, it doesn't work.
Given a Cloud Function (deployed as http triggered):
logger.py
main.js
So I create a CloudLoggingHandler with my log name and add it to my Python logger. This is what the official docs suggest: https://cloud.google.com/python/docs/reference/logging/latest/handlers-cloud-logging But it doesn't work in Cloud Functions.
Here's what my log entry looks like:
The logName field is still the default stdout.