```python
from functools import wraps as _wraps

def try_x_times(times: int = 3):
    """Return a wrapper that will attempt to run the function it wraps `times` times.

    If the function still fails on the final attempt, the exception is allowed to go uncaught.
    """
    def try_wrapper(function):
        """Wrap the given function so it is tried `times` times before giving up."""
        @_wraps(function)
        def try_function_wrapper(*args, **kwargs):
            """Call the function with the given arguments, re-raising the last exception if every attempt fails."""
            exception = None
            for attempt in range(times):
                try:
                    return function(*args, **kwargs)
                except Exception as e:
                    exception = e
            raise exception
        return try_function_wrapper
    return try_wrapper
```
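For illustration, a quick usage sketch of the decorator above (the flaky function and its failure pattern are made up for this example):

```python
attempts = {"count": 0}

@try_x_times(times=3)
def flaky() -> str:
    # Hypothetical flaky operation: fails on the first two calls.
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("simulated failure")
    return "ok"

print(flaky())  # succeeds on the third attempt and prints "ok"
```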
I'd like to work on this, but I'd first want to clarify how this should be handled. Initially, I was thinking of something similar to @CoolCat467's suggestion; however, there is a problem: most of these functions get the number of tries as an argument, which means we can't access it from a decorator.

```python
@try_times(tries)  # we don't have `tries` here
def random_function(tries: int = 3, *a, **kw):
    ...
```
These are the ways I was able to think of to solve this issue:

1. Assume `tries` will always be the n-th argument of all decorated functions and get it with `args[n]`
2. Force `tries` to be a keyword-only argument
3. Drop `tries` as a function argument and keep it static in the decorator, or as a constant

In my opinion, option 3 is fine, as we probably don't need a variable number of tries for each function; we could just store it somewhere as a project-wide constant, perhaps adjustable with environment variables.

If we need to keep `tries` variable, the keyword-only `tries` approach could look like this:
```python
from functools import wraps

def retry(exceptions=(Exception,)):
    """
    Decorator that re-runs given function `tries` times if an error occurs.
    The `tries` value is extracted from the keyword arguments of the decorated function.
    If the function fails even after all of the retries, raise the last
    exception that the function raised.
    """
    def decorate(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            tries = kwargs.get("tries", 3)
            last_exc: Exception
            for _ in range(tries):
                try:
                    return func(*args, **kwargs)
                except exceptions as exc:
                    last_exc = exc
            else:
                raise last_exc
        return wrapper
    return decorate

@retry(exceptions=(IOError, ConnectionError))
def random_function(*, tries: int = 5, **kw):
    ...
```
This would require changing all involved functions so that the `tries` parameter becomes a keyword-only argument rather than a positional one. This may break some backwards compatibility, since calls like `random_function(5)` would stop working and would now have to be changed to `random_function(tries=5)`.
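To make that breakage concrete, here is a minimal sketch of what happens once `tries` becomes keyword-only (the function name is illustrative):

```python
def random_function(*, tries: int = 3):
    ...

random_function(tries=5)  # fine
random_function(5)        # TypeError: random_function() takes 0 positional arguments but 1 was given
```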
However, there is still a pretty big issue with this: since the decorator only sees the `*args` and `**kwargs` passed by the caller, it never sees the default values defined in the decorated function itself. This means that having `tries=5` as the default in the function's signature doesn't actually set the default number of tries to 5; the decorator just checks for `tries` in `**kwargs`, and if it's not there (at least in the case above, where I used `kwargs.get('tries', 3)`), it defaults to 3, not the 5 that's set in the function, as the sketch below demonstrates.
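A small sketch of that pitfall, assuming the `retry` decorator above:

```python
@retry()
def fetch(*, tries: int = 5):
    raise IOError("always fails")

try:
    # The wrapper only sees the caller's **kwargs; since `tries` wasn't
    # passed, kwargs.get("tries", 3) yields 3 attempts, not the 5 declared
    # in fetch's own signature.
    fetch()
except IOError:
    pass  # raised after 3 attempts, not 5
```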
This means our decorator will always completely ignore the default value defined in the function (5). This could be handled in these ways:

1. Using the `inspect` module, we could extract the function's signature and get the default values, but that is quite complex
2. `tries` … (probably not viable)
3. Keep the default as a project-wide constant, or pass it as a decorator argument

With option number 3, it could be done with a project-wide constant holding the default timeout, since there's probably no need for different default timeouts anywhere. However, if we would at any point need to change the default timeout for some individual function, this would be a big issue. We could instead simply take the default value from a direct decorator argument (`try_times(default_tries=5)`) and repeat it in the function signature; this is also more explicit, which is neat.
In my opinion, if we do want a bit more flexibility and more natural-looking code at the call sites, we should go with option number 1. While this would greatly increase the complexity of the decorator itself, the task it performs is fairly clear from the docstring and the name, and wherever it is used it would simply work, extracting the default number of tries from the function's own parameter signature; a rough sketch of that follows.
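A minimal sketch of option number 1, using `inspect.signature` to fall back to the function's own default (the decorator name here is illustrative, not a final design):

```python
import inspect
from functools import wraps

def retry_with_signature_default(exceptions=(Exception,)):
    def decorate(func):
        # Extract the signature default of `tries` once, at decoration time.
        sig_default = inspect.signature(func).parameters["tries"].default

        @wraps(func)
        def wrapper(*args, **kwargs):
            tries = kwargs.get("tries", sig_default)
            last_exc = None
            for _ in range(tries):
                try:
                    return func(*args, **kwargs)
                except exceptions as exc:
                    last_exc = exc
            raise last_exc
        return wrapper
    return decorate

@retry_with_signature_default(exceptions=(IOError,))
def foo(*, tries: int = 5):
    ...
```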
Example of handling this with method number 3 (default value as a decorator param):

```python
from mcstatus.constants import DEFAULT_TIMEOUT

@retry(default_tries=DEFAULT_TIMEOUT)
def foo(*, tries: int = DEFAULT_TIMEOUT):
    ...
```
Any follow-ups on this? I'd like to implement this, but I'm still waiting for a decision about how exactly this should be handled. In the post above I've mentioned some possibilities, but some of them would break backwards compatibility, and those that wouldn't can be a bit messy to implement.
Why would one need to change the number of tries for only one of the functions? Why not have a global tries count that could be changed if needed?
> Why would one need to change the number of tries for only one of the functions? Why not have a global tries count that could be changed if needed?
Because global variables are messy, and people might want different retry counts for different functions.
The functions don't need to have a `tries` argument, but can still be made backwards compatible with a decorator. Here's what I was imagining when I first posted the issue:
```python
import functools

def retry(tries):  # a default could be given here; support for exceptions=(IOError, ConnectionError) is possible too
    def outer(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            wrapped_tries = kwargs.pop("tries", tries)
            exception = None
            for attempt in range(wrapped_tries):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    exception = e
            raise exception
        return wrapper
    return outer

@retry(tries=5)
def failing_function(counter):
    counter[0] += 1
    print(counter[0])
    if counter[0] != 10 and counter[0] != 15:
        raise Exception("I'm a failure")

def main():
    counter = [0]
    failing_function(counter, tries=10)
    assert counter[0] == 10
    failing_function(counter)
    assert counter[0] == 15

if __name__ == "__main__":
    main()
```
Supporting async would involve checking if the function is a coroutine and running the function to completion in an event loop? I'm not sure on that.
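One possible sketch (just one approach, not necessarily how it would land in mcstatus): rather than driving an event loop inside the decorator, the wrapper itself can be a coroutine that awaits the wrapped function, selected via `inspect.iscoroutinefunction`:

```python
import asyncio
import functools
import inspect

def retry(tries=3):
    def outer(func):
        if inspect.iscoroutinefunction(func):
            @functools.wraps(func)
            async def async_wrapper(*args, **kwargs):
                wrapped_tries = kwargs.pop("tries", tries)
                exception = None
                for _ in range(wrapped_tries):
                    try:
                        # Await the coroutine instead of calling it synchronously.
                        return await func(*args, **kwargs)
                    except Exception as e:
                        exception = e
                raise exception
            return async_wrapper

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            wrapped_tries = kwargs.pop("tries", tries)
            exception = None
            for _ in range(wrapped_tries):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    exception = e
            raise exception
        return wrapper
    return outer

@retry(tries=3)
async def flaky_coroutine():
    raise IOError("simulated failure")

async def main():
    try:
        # The decorated coroutine stays awaitable; callers await it as usual,
        # so no extra event loop is needed inside the decorator.
        await flaky_coroutine()
    except IOError:
        print("failed after 3 attempts, as expected")

asyncio.run(main())
```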
> The functions don't need to have a `tries` argument, but can still be made backwards compatible with a decorator

Sounds good!
I just wanted to note that getting `tries` from kwargs in the wrapped decorator function will work nicely, though it won't ensure complete backwards compatibility! This is because `tries` will now effectively be a keyword-only parameter, so if anyone was calling `foo(5)` instead of `foo(tries=5)`, that code will stop working once an implementation like this is released.

I don't think this is a huge issue, and I love this method of solving the problem. `tries` is something that makes sense to be keyword-only anyway, since it would be very unclear what it's doing if it were passed positionally, so this minor backwards-compatibility breakage is acceptable. Even if someone does encounter it, it's very simple to fix, and it will make that code cleaner as a benefit.

I just wanted to mention this here in case someone encounters it and stumbles on this issue; this is the reason why.
`tries` could be conditionally grabbed from the positional `*args` if we cared about staying backward compatible for this parameter. Of course, that would involve some introspection to figure out which index `tries` is located at in the `*args` list. But, since I also agree with it being keyword-only, I'll bump the major version and mention the breakage on that release tag when it happens. No need to go through that much trouble for this pattern cleanup 😄
There are several places where code looks like this:
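A representative sketch of the kind of repeated inline retry pattern being described (illustrative only, not copied from the actual codebase; `do_network_request` is a hypothetical stand-in for the real I/O call):

```python
def do_network_request():
    # Hypothetical operation standing in for the real I/O call.
    raise IOError("simulated failure")

exception = None
for attempt in range(3):
    try:
        result = do_network_request()
        break
    except IOError as e:
        exception = e
else:
    # All attempts failed; re-raise the last exception.
    raise exception
```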
The retry logic should be moved into a function or decorator that accepts the code to run in the try block and returns whatever that block returns. Additionally, this new retry refactoring must be able to support async try blocks too, by properly waiting on them to finish.