Make `Module.trainable_variables` also return `tf.Variable`s (or, for PyTorch, tensors with `requires_grad=True`) which are properties of modules and sub-modules (and which are not necessarily contained in ProbFlow parameters).
Also allow embedding `tf.Module`s (or, for PyTorch, `nn.Module`s) and recursively search them for backend variables.
This will mean you can mix ProbFlow parameters and modules with backend variables and modules. For example:
class DenseNetwork(tf.keras.Model):
    """A totally tensorflow-only module"""

    def __init__(self, units):
        super().__init__()
        # Can't name this attribute `layers`: tf.keras.Model reserves
        # it as a read-only property
        self.steps = [
            tf.keras.layers.Dense(units[i + 1], input_shape=(units[i],))
            for i in range(len(units) - 1)
        ]

    def call(self, x):
        for layer in self.steps:
            x = tf.nn.relu(layer(x))
        return x


class NeuralLinear(pf.ContinuousModel):

    def __init__(self, units):
        self.net = DenseNetwork(units)  # tensorflow model!
        self.w = pf.Parameter([units[-1], 1])  # probflow parameters
        self.b = pf.Parameter([1, 1])
        self.s = tf.Variable(tf.random.normal([1, 1]))  # tensorflow variable!

    def __call__(self, x):
        loc = self.net(x) @ self.w() + self.b()
        scale = tf.exp(self.s)
        return pf.Normal(loc, scale)
And then, with recursive variable/module collection, ProbFlow will also optimize those backend variables along with the ones in ProbFlow modules/parameters.
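The recursive collection could work roughly like the sketch below. This is a hypothetical illustration of the proposed behavior, not ProbFlow's actual implementation: the `Variable`, `Parameter`, and `Module` classes here are minimal stand-ins for `tf.Variable`, `pf.Parameter`, and `pf.Module` (including the `variables` attribute on `Parameter`), used so the collection logic is self-contained.

```python
class Variable:
    """Stand-in for a backend variable (tf.Variable)"""
    def __init__(self, name):
        self.name = name


class Parameter:
    """Stand-in for pf.Parameter, which owns backend variables
    for its variational posterior"""
    def __init__(self, name):
        self.variables = [Variable(name + "/loc"), Variable(name + "/scale")]


class Module:
    """Stand-in for pf.Module with the proposed recursive search"""

    @property
    def trainable_variables(self):
        found = []
        for attr in vars(self).values():
            if isinstance(attr, Variable):
                found.append(attr)  # raw backend variable
            elif isinstance(attr, Parameter):
                found.extend(attr.variables)  # probflow parameter
            elif isinstance(attr, Module):
                found.extend(attr.trainable_variables)  # recurse into sub-module
            elif isinstance(attr, (list, tuple)):
                for el in attr:  # also search containers of sub-modules
                    if isinstance(el, Module):
                        found.extend(el.trainable_variables)
        return found


class SubNet(Module):
    def __init__(self):
        self.s = Variable("sub/s")


class Model(Module):
    def __init__(self):
        self.w = Parameter("w")      # probflow-style parameter
        self.b = Variable("b")       # raw backend variable
        self.net = SubNet()          # embedded sub-module


names = [v.name for v in Model().trainable_variables]
# → ["w/loc", "w/scale", "b", "sub/s"]
```

The optimizer would then receive this flat list, so gradients flow into backend variables and ProbFlow parameters alike. A real implementation would additionally have to recurse into embedded `tf.Module`/`nn.Module` instances via the backend's own `trainable_variables`/`parameters()` mechanism.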