cog-imperial / OMLT

Represent trained machine learning models as Pyomo optimization formulations

Using Scikit-Learn Neural Network with OMLT v1.1 #105

Closed Xinhe-Chen closed 9 months ago

Xinhe-Chen commented 1 year ago

Dear @carldlaird @rmisener ,

We have a question about using OMLT related to some work started by @jalving under DISPATCHES last year.

In OMLT v0.3.1, it is possible to use omlt.neuralnet.NetworkDefinition to create an object that can be further processed in OMLT. https://github.com/jalving/dispatches/blob/prescient_verify/dispatches/workflow/run_surrogate_optimization/rankine_cycle_case/read_scikit_to_omlt.py

In OMLT v1.1, omlt.neuralnet.NetworkDefinition has been restructured, and this function no longer works.

For OMLT v1.1, is it possible to build a similar function so that we can read scikit-learn neural networks into OMLT? Who is the best person for us to ask about this?

Best regards, Xinhe Chen and @adowling2

jalving commented 1 year ago

Hi @Xinhe-Chen. Apologies it took a while to get you a response on this. It should still be possible to do this in OMLT v1.1, but it is not as straightforward as it was in v0.3.1. The notebook here shows how to create a network definition manually and add layers with activation functions. The new NetworkDefinition is a bit different, however, as it can accept a grid of inputs as opposed to a vector.

If you want to try doing something manual yourself: have a look at how the keras reader imports sequential keras neural networks.

Eventually we plan to have more standard neural network library interfaces. In the short term, building your own NetworkDefinition is probably your best bet.

Xinhe-Chen commented 1 year ago

@jalving Thank you for your reply! If I understand correctly, the whole process of adding a scikit-learn NN will be:

  1. Read the scikit-learn NN parameters (weights, biases, activation functions).
  2. Manually create the NN hidden layers using DenseLayer.
  3. Put the resulting network in a Pyomo model.

I will try this. Thank you so much for your help!