cog-imperial / OMLT

Represent trained machine learning models as Pyomo optimization formulations

Mixing ReLU and non-linear activation functions #19

Closed · fracek closed this 2 years ago

fracek commented 2 years ago

We discussed that it should not be possible to generate a formulation that mixes MIP constraints for ReLU activations with non-linear constraints for other activations. OMLT should throw an exception in this case.
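For illustration, a minimal sketch of the requested behavior, not code from this issue: the network, its input bounds, and the `load_keras_sequential` keyword are placeholders, and the thread does not pin down the exact exception type.

```python
import pyomo.environ as pyo
import tensorflow.keras as keras
from omlt import OmltBlock
from omlt.io import load_keras_sequential
from omlt.neuralnet import ReluBigMFormulation

# Placeholder network that mixes relu and tanh hidden layers.
mixed_net = keras.Sequential([
    keras.layers.Input(shape=(1,)),
    keras.layers.Dense(4, activation="relu"),
    keras.layers.Dense(4, activation="tanh"),
    keras.layers.Dense(1),
])

net = load_keras_sequential(mixed_net, scaled_input_bounds={0: (0.0, 1.0)})

m = pyo.ConcreteModel()
m.nn = OmltBlock()

# The behavior requested in this issue: applying a MIP ReLU formulation
# to a network that also contains non-linear activations should raise.
try:
    m.nn.build_formulation(ReluBigMFormulation(net))
except Exception as exc:  # exception type is not specified in the thread
    print(f"mixed activations rejected: {exc}")
```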

jalving commented 2 years ago

The new formulations as of PR #33 should fix this. ReluBigMFormulation will not allow nonlinear activation functions, but a user can certainly try to mix activations using FullspaceNNFormulation.
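A sketch of that distinction, using the same placeholder network as above; note that recent OMLT releases spell the class `FullSpaceNNFormulation`, and exactly how ReLU layers are encoded inside the full-space formulation is version-dependent.

```python
import pyomo.environ as pyo
import tensorflow.keras as keras
from omlt import OmltBlock
from omlt.io import load_keras_sequential
from omlt.neuralnet import FullSpaceNNFormulation

# Same placeholder mixed-activation network as in the earlier sketch.
mixed_net = keras.Sequential([
    keras.layers.Input(shape=(1,)),
    keras.layers.Dense(4, activation="relu"),
    keras.layers.Dense(4, activation="tanh"),
    keras.layers.Dense(1),
])
net = load_keras_sequential(mixed_net, scaled_input_bounds={0: (0.0, 1.0)})

m = pyo.ConcreteModel()
m.nn = OmltBlock()

# Unlike ReluBigMFormulation, the full-space formulation accepts a
# network whose layers mix activations; the resulting Pyomo model
# carries both ReLU constraints and smooth tanh constraints.
m.nn.build_formulation(FullSpaceNNFormulation(net))
```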