@galamit86
Code looks fine, however implicitly returning `None` when in fact an Exception occurred doesn't seem right.
Looking at your example (line 1399), two exceptions can occur (see docs https://googleapis.dev/python/bigquery/latest/generated/google.cloud.bigquery.client.Client.html#google.cloud.bigquery.client.Client). When you don't catch these and just pass them on, you don't know what's happening.
I do appreciate that it's more code to write `try ... except` all the time, but it seems like this anti-pattern: https://realpython.com/the-most-diabolical-python-antipattern/
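For illustration, this is the kind of silent failure that the anti-pattern article warns about - a minimal, made-up sketch (the function and client here are hypothetical, not statline_bq code):

```python
def fetch_dataset(client, dataset_id):
    try:
        return client.get_dataset(dataset_id)
    except Exception:
        # The exception is swallowed: the caller can no longer tell
        # "dataset doesn't exist" apart from "bad credentials" or a network error.
        return None
```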
I don't know exactly what good practice is here. After reading up on Item 65 in Slatkin's Effective Python, my thoughts are as follows:
I am on board with the general sentiment of being explicit.
Where I would like to deviate from it is when I'm deliberately (mis)using raised exceptions to check for something, and returning `None` (or `False`) as an indication. That happens in 2 places: one of them is in `check_bq_dataset` (see my comment there), the other is in `get_metadata_gcp` (line 269).
In both cases this isn't really about the new implementation of the log decorator - it's a separate issue, and the code was behaving in this way beforehand. If we are ok with this concept in general, then I will implement returning the error from the decorator after logging, instead of `None`, and alter these 2 functions accordingly (as I suggest in the specific comment above).
And finally, by changing the decorator's behaviour to end with:

```python
except Exception as e:
    # Log exception if one occurs in the function
    logger.exception(f"Exception in {func.__name__}: {str(sys.exc_info()[1])}")
    return e
```

I think we avoid the problems pointed out by the anti-pattern post you sent, right? Since we actually log the information and return the error, in fact raising it?
@galamit86 This works for me, thanks. And I agree that the issue with the two functions is something else and minor in any case since they just do some checking.
@dkapitan
Great - a small correction on my part: `return e` does not reraise the exception - `raise e` does. Pushed updated code.
Also, logging `sys.exc_info()[1]` (taken from the Medium article) is the same as logging `e`, as far as I can see, so I'm using that.
To sum up, our general logging decorator should now behave like so:
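A minimal sketch of a decorator with that behaviour - log the exception, then re-raise it - would look roughly like this (an illustration based on the discussion, not necessarily the exact code in `log.py`):

```python
import functools
import logging

logger = logging.getLogger(__name__)


def logdec(func):
    """Log calls that fail, then re-raise, leaving the function's behaviour unchanged."""

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            # Log the exception (including the traceback), then re-raise it so
            # callers see exactly what they would see without the decorator.
            logger.exception(f"Exception in {func.__name__}: {e}")
            raise e

    return wrapper
```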
This allows us, as far as I can see, to decorate any Python function without changing its expected behaviour - for example, to continue relying on raising exceptions ourselves, if we like to, as we do in `check_gcp_env`:
```python
from typing import List


def check_gcp_env(gcp_env: str, options: List[str] = ["dev", "test", "prod"]):
    if gcp_env not in options:
        raise ValueError(f"gcp_env must be one of {options}")
    else:
        return True
```
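As a usage sketch (assuming the `logdec` import path mentioned further down, `statline_bq.log`), decorating `check_gcp_env` leaves the `ValueError` behaviour intact - the decorator only logs it first:

```python
from typing import List

from statline_bq.log import logdec  # assumed import path, per the description below


@logdec
def check_gcp_env(gcp_env: str, options: List[str] = ["dev", "test", "prod"]):
    if gcp_env not in options:
        raise ValueError(f"gcp_env must be one of {options}")
    else:
        return True


check_gcp_env("prod")  # returns True, nothing logged at ERROR level

try:
    check_gcp_env("staging")
except ValueError as e:
    # The decorator logged the exception, but it still propagates as before.
    print(f"Still raised: {e}")
```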
This PR adds standard logging to the library, through the implementation of a general logging decorator + a per-module specific logger.
The implementation is done in 3 files:
- `logging_conf.toml` is where all logging configurations are provided - we can add and define formatters, handlers and loggers there (a rough Python equivalent of the current configuration is sketched after this list). Currently, one `logger` is defined, `root`, with 2 `handlers`:
  - `file` - a `RotatingFileHandler` set to `WARNING` level (although as it currently stands, we only use `INFO` or `ERROR` in practice)
  - `console` - a `StreamHandler` set to `INFO` level
  - Both `handlers` use the same `formatter`: `standard`
- `log.py` - where the decorator is defined
- `utils.py` - a module-specific `logger` is defined within that file, to add additional, non-generic log events.

Closes #67
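A rough Python equivalent of the configuration described above (the real values live in `logging_conf.toml`; the filename, rotation sizes and format string below are assumptions):

```python
import logging.config

LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        # The single "standard" formatter shared by both handlers.
        "standard": {"format": "%(asctime)s - %(name)s - %(levelname)s - %(message)s"},
    },
    "handlers": {
        "file": {
            "class": "logging.handlers.RotatingFileHandler",
            "level": "WARNING",
            "formatter": "standard",
            "filename": "statline_bq_logs.log",  # assumed filename
            "maxBytes": 1_000_000,               # assumed rotation settings
            "backupCount": 3,
        },
        "console": {
            "class": "logging.StreamHandler",
            "level": "INFO",
            "formatter": "standard",
        },
    },
    "root": {"handlers": ["file", "console"], "level": "INFO"},
}

logging.config.dictConfig(LOGGING_CONFIG)
```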
@dkapitan Changes from our last discussion in the issue:

- In addition to using `logdec` from `statline_bq.log`, we can use the same logger, named `statline_bq.utils`, which was already configured in `log.py`, and simply use it to add specific logging events wherever we like. Python is awesome.
- I rely on the `None` value returned by the decorator in case of an `exception`. That, plus duck typing, replaces some cases which were previously handled by `try... except...` blocks returning `None` or `False`. See for example `check_bq_dataset`, defined in line 1399, called in line 1550 (sketched below). My concern here is that I am relying on behaviour that is defined outside of the function. Let me know what you think.
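For reference, a sketch of the pattern described in that last point - not the actual `check_bq_dataset` code, and using a stand-in decorator to mimic the return-`None`-on-exception behaviour:

```python
import functools


def returns_none_on_exception(func):
    """Illustrative stand-in for the decorator behaviour described above."""

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            return None

    return wrapper


@returns_none_on_exception
def check_bq_dataset(client, dataset_id: str) -> bool:
    # google.cloud.bigquery's Client.get_dataset raises NotFound if the
    # dataset doesn't exist; the decorator turns that into a None return value.
    client.get_dataset(dataset_id)
    return True


# The caller then relies on truthiness (duck typing) rather than try...except:
# if not check_bq_dataset(client, "my_dataset"):
#     client.create_dataset("my_dataset")
```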