alpha-davidson / TensorBNN

Full Bayesian inference for neural networks using TensorFlow
https://alpha-davidson.github.io/TensorBNN/
MIT License

Adding activation functions #3

Closed · akgopan closed this issue 3 years ago

akgopan commented 3 years ago

Hi, I would like to add an exponential activation function to this. Would it be OK if I added it to the activation code and pushed it here so it is available for everyone?

brkronheim commented 3 years ago

I've merged the pull request, though this new code won't be available through the pip install yet. If anyone wants to use this activation with the version on pip, they can include the class declaration somewhere in the Python file used for training:

import tensorflow as tf

from tensorBNN.layer import Layer  # base class for tensorBNN layers; import path may vary by version


class Exp(Layer):
    """Exponential activation function."""

    def __init__(self, inputDims=None, outputDims=None):
        self.numTensors = 0       # an activation has no weight tensors
        self.numHyperTensors = 0  # and no hyperparameter tensors
        self.name = "Exp"

    def predict(self, inputTensor, _):
        # Apply the exponential element-wise to the layer input
        return tf.math.exp(inputTensor)

Once this is present, simply add the Exp layer like any other activation. To make predictions with it, include the class in the prediction code as well, and pass the argument customLayerDict = {"Exp": Exp} when initializing the predictor object. This approach works for any custom layer.
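
For anyone wiring this up for the first time, here is a rough sketch of both steps. The import path, the predictor's arguments (a saved-network directory plus a dtype), and the predict call are taken from the package's documented usage and may differ between versions; the directory name, input array, and the neuralNet object referenced in the comments are placeholders.

import numpy as np
import tensorflow as tf

from tensorBNN.predictor import predictor  # import path assumed from the tensorBNN package

# Training side: with a tensorBNN network object (here called neuralNet) already
# built as in the package's examples, the custom activation stacks like any other:
#     neuralNet.add(Exp())

# Prediction side: register the custom class so the saved network can be rebuilt.
inputData = np.random.rand(20, 3)  # placeholder test inputs: 20 samples, 3 features

bnn = predictor("trainedNetwork/",             # placeholder saved-network directory
                dtype=tf.float32,
                customLayerDict={"Exp": Exp})  # maps the saved layer name to the class
predictions = bnn.predict(inputData)           # predict call assumed from the package's examples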