sradc / SmallPebble

Minimal deep learning library written from scratch in Python, using NumPy/CuPy.
Apache License 2.0

Variable.local_gradients unintended behavior is unclarified #1

Closed Fazel94 closed 3 years ago

Fazel94 commented 3 years ago

class Variable:
    "To be used in calculations to be differentiated."

    def __init__(self, array, local_gradients=[]):
        self.array = array
        self.local_gradients = local_gradients

this snippet would lead to:

>>> var1 = Variable(0)
>>> var1.local_gradients.append(1)
>>> var2 = Variable(0)
>>> print(var2.local_gradients)
[1]

This can generally lead to hard-to-track bugs. Unless this behavior is genuinely needed, it would be advisable to change it, or at least to clarify in the code comments or docs that every Variable created without an explicit local_gradients argument shares a single default list.
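To make the mechanism explicit: a default argument is evaluated once, at function definition time, so a mutable default is shared by every call that omits the argument. A minimal sketch (using a toy `Box` class, not part of SmallPebble) alongside the standard `None`-sentinel fix:

```python
# Mutable-default-argument pitfall: the default list is created once,
# when the function is defined, and shared by every call that omits it.
class Box:
    def __init__(self, items=[]):
        self.items = items

a = Box()
a.items.append(1)
b = Box()
print(b.items)  # [1] -- b.items is the very same list object as a.items

# Standard fix: use None as a sentinel and build a fresh list per call.
class SafeBox:
    def __init__(self, items=None):
        self.items = [] if items is None else items

c = SafeBox()
c.items.append(1)
d = SafeBox()
print(d.items)  # [] -- each instance gets its own list
```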

sradc commented 3 years ago

Thank you, good point.

I see two options for the fix:

a)

def __init__(self, array, local_gradients=None):
    self.array = array
    self.local_gradients = local_gradients if local_gradients is not None else []

b)

def __init__(self, array, local_gradients=()):
    self.array = array
    self.local_gradients = local_gradients

a) is consistent with local_gradients being a list when defined in SmallPebble operations. b) is shorter and more readable. Neither provides enough information to know what local_gradients should contain.

I'm going with b), for readability, and because I don't think the inconsistency (local_gradients being either a list or a tuple) is too bad.

I did originally use tuple for local_gradients, but switched to list for readability, e.g. [(a, b)] vs ((a, b),).