samsinai / FLEXS

Fitness landscape exploration sandbox for biological sequence design.
https://flexs.readthedocs.io/en/latest/index.html
Apache License 2.0

Global epistasis model isn't monotonic #54

Open an1lam opened 2 years ago

an1lam commented 2 years ago

The current implementation of the global epistasis model doesn't guarantee that the final output is monotonic in the output of the initial linear layer. In my experience, the easiest way to guarantee monotonicity is to constrain the weights of the nonlinear layers to be non-negative, e.g. by passing them through a softplus before they're used. (I've tried this in torch but assume it works similarly well in keras.)
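To make the suggestion concrete, here is a minimal PyTorch sketch of that constraint, not the FLEXS implementation: the class name MonotonicNonlinearity and the hidden_dim parameter are hypothetical. Because softplus makes every weight positive and ReLU is non-decreasing, the composite map from the linear layer's scalar output to the final output is non-decreasing by construction.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MonotonicNonlinearity(nn.Module):
    """Hypothetical sketch: a 1 -> hidden -> 1 nonlinearity that is
    monotone (non-decreasing) by construction, because every weight
    is reparameterized through softplus and so stays positive."""

    def __init__(self, hidden_dim: int = 50):
        super().__init__()
        # Raw, unconstrained parameters; softplus is applied in forward().
        self.w0 = nn.Parameter(torch.randn(1, hidden_dim))
        self.b0 = nn.Parameter(torch.zeros(hidden_dim))
        self.w1 = nn.Parameter(torch.randn(hidden_dim, 1))
        self.b1 = nn.Parameter(torch.zeros(1))

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, 1) output of the initial linear (additive) layer.
        h = F.relu(z @ F.softplus(self.w0) + self.b0)
        return h @ F.softplus(self.w1) + self.b1

A quick check that increasing inputs give non-decreasing outputs:

f = MonotonicNonlinearity()
z = torch.linspace(-2, 2, 5).unsqueeze(1)
y = f(z)
assert torch.all(y[1:] >= y[:-1])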

an1lam commented 2 years ago

As an example of a non-monotonic function the current setup can learn, consider a simplified version of your nonlinear function:

import numpy as np

# Output of the linear layer for four inputs
l = np.array([[-1.], [0.], [1.], [2.]])
# Project out to k=2 dimensions, then apply ReLU
w_0 = np.array([[2., -2.]])
h = np.maximum(l @ w_0, 0)
# h: [[0, 2], [0, 0], [2, 0], [4, 0]]
# Reduce back down to 1 dimension
w_1 = np.array([[0.75], [2.]])
out = (h @ w_1).squeeze()
# out: [4., 0., 1.5, 3.]

Clearly this is not monotonic: the output drops from 4 to 0 and then climbs back up, and all we did was use a single hidden layer with a ReLU non-linearity.
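For contrast, and purely as an illustrative sketch: running the same counterexample weights through softplus before using them makes the map non-decreasing (plain numpy, with a hand-rolled softplus):

import numpy as np

def softplus(x):
    return np.log1p(np.exp(x))

l = np.array([[-1.], [0.], [1.], [2.]])
w_0 = np.array([[2., -2.]])
w_1 = np.array([[0.75], [2.]])

# Same computation as above, but every weight is forced positive.
h = np.maximum(l @ softplus(w_0), 0)
out = (h @ softplus(w_1)).squeeze()
# out ≈ [0., 0., 2.69, 5.38] -- non-decreasing in the linear-layer output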