HiveBrain v1.2.0

Memoizing decorator with retries, part 2

Submitted by: @import:stackexchange-codereview

Problem

A while ago I asked this question: Memoizing decorator that can retry, and then promptly forgot about it. More recently I saw Python decorator for retrying w/exponential backoff and wanted to add backoff, but figured I should improve what I already have first. I have taken some of the suggestions on the original, but really I've completely reimagined it (and if I do say so myself, rather cleverly too). My biggest issue is the current distinction between "exceptions that get suppressed but not remembered" and "exceptions that get remembered", which seems like it might be a bit confusing. Is there a friendlier interface here?

import functools
from types import MappingProxyType as FrozenDict

class Memoizer:

    def __init__(self, retry_times=0, suppressed_exceptions=tuple(), capture_exceptions=False):
        self.retry = retry
        self.retry_times = retry_times
        self.suppressed_exceptions = suppressed_exceptions
        self.capture_exceptions = capture_exceptions

    def __call__(self, function):
        d = {}

        @functools.wraps(function)
        def wrapper(*args, **kwargs, __forget=False):
            key = (args, FrozenDict(kwargs))
            if key not in d or __forget:
                i = self.retry_times
                while i > 1:
                    try:
                        d[key] = function(*args, **kwargs)
                    except self.suppressed_exceptions:
                        continue
                    except Exception as e:
                        if self.capture_exceptions:
                            d[key] = e
                            break
                        raise
                    else:
                        break
                    i -= 1
                else:
                    # If we didn't already break out, the last attempt shouldn't suppress exceptions
                    d[key] = function(*args, **kwargs)

            return d[key]

        return wrapper

Solution

Your code is broken:

  • __forget=False needs to come before **kwargs; a keyword-only parameter can't follow **kwargs, so the current signature is a SyntaxError.

  • retry is undefined (though it's never actually used).

  • MappingProxyType is not hashable, so you can't use it in a memoization key.

To fix the problem with MappingProxyType you can create a hashable dict, which seems kinda hacky, but is better than nothing. The other two are easy fixes.
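For instance, such a hashable dict could be sketched like this (the class name is my own, and it is only safe if the dict is never mutated after being used as a key):

```python
class HashableDict(dict):
    """A dict that can appear inside a cache key.

    Hashing is based on the item set, so two HashableDicts with the
    same contents hash (and compare) equal. This assumes the dict is
    never mutated after it has been used as a key.
    """
    def __hash__(self):
        return hash(frozenset(self.items()))
```

With this, key = (args, HashableDict(kwargs)) works where the MappingProxyType key raises TypeError as soon as you do the `key not in d` lookup.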

I don't see why you're using a class; the only benefit you get from it is that you can hide implementation details in it. And all the self. prefixes are an annoyance in your current implementation, in my opinion.

Design-wise, I still don't get the name retry_times: let's actually retry that many times, rather than try that many times in total.

And I'd split the memoizer away from the retry decorator.
Being able to use, say, functools.lru_cache would be nice.

And so I'd make the retry aspect a plain function decorator:
removing where it currently adds to the cache,
changing the while loop to a for loop,
and removing the cache entirely.
This should get you:

def retry(retry_times=0, suppressed_exceptions=tuple(), capture_exceptions=False):
    def inner(function):
        @functools.wraps(function)
        def wrapper(*args, **kwargs):
            for _ in range(retry_times):
                try:
                    return function(*args, **kwargs)
                except suppressed_exceptions:
                    continue
                except Exception as e:
                    if capture_exceptions:
                        return e
                    raise
            else:
                return function(*args, **kwargs)
        return wrapper
    return inner


And now you can use the retry part without the memoizing part!
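For instance, here's a sketch of the two pieces stacked, with functools.lru_cache doing the memoizing (the retry decorator is repeated, slightly condensed, so the example runs on its own; flaky_square is a made-up function):

```python
import functools

def retry(retry_times=0, suppressed_exceptions=tuple(), capture_exceptions=False):
    # The retry decorator from above, repeated so this example is self-contained.
    def inner(function):
        @functools.wraps(function)
        def wrapper(*args, **kwargs):
            for _ in range(retry_times):
                try:
                    return function(*args, **kwargs)
                except suppressed_exceptions:
                    continue
                except Exception as e:
                    if capture_exceptions:
                        return e
                    raise
            # Last attempt: nothing is suppressed or captured.
            return function(*args, **kwargs)
        return wrapper
    return inner

calls = {"count": 0}

@functools.lru_cache(maxsize=None)               # memoizing is lru_cache's job now
@retry(retry_times=2, suppressed_exceptions=(ValueError,))
def flaky_square(x):
    calls["count"] += 1
    if calls["count"] == 1:                      # fail only on the very first call
        raise ValueError("transient failure")
    return x * x

flaky_square(3)   # first attempt fails, the retry succeeds, result is cached
flaky_square(3)   # served from the cache; the function body never runs again
```

Because lru_cache handles the caching, you also get cache_info() and cache_clear() for free, which the hand-rolled dict never had.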

After this, I'd just make a generic memoizer, without caching the **kwargs.
Say:

def memoize(fn):
    cache = {}
    @functools.wraps(fn)
    def wrapper(*args, _forget=False):
        key = args
        if key not in cache or _forget:
            cache[key] = fn(*args)
        return cache[key]
    return wrapper
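A quick sketch of how the _forget flag behaves (memoize is repeated so the example runs on its own; double is a made-up function):

```python
import functools

def memoize(fn):
    # The memoizer from above, repeated so this example is self-contained.
    cache = {}
    @functools.wraps(fn)
    def wrapper(*args, _forget=False):
        key = args
        if key not in cache or _forget:
            cache[key] = fn(*args)
        return cache[key]
    return wrapper

calls = []

@memoize
def double(x):
    calls.append(x)
    return 2 * x

double(21)                  # computed; the body runs
double(21)                  # cached; the body does not run again
double(21, _forget=True)    # _forget forces a recomputation
```

Note that _forget is keyword-only here, since it comes after *args, so it can never collide with a positional argument.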


Context

StackExchange Code Review Q#133514, answer score: 4
