MilesCranmer / PySR

High-Performance Symbolic Regression in Python and Julia
https://astroautomata.com/PySR
Apache License 2.0

Taking Derivatives of Candidate Expression Inside Custom Loss Function #296

Closed · gopal-iyer closed this 1 year ago

gopal-iyer commented 1 year ago

Hi Miles,

I'm working on a problem where I'd like to define a custom loss function that doesn't just use the candidate expression's predicted values on the training data (y_pred given X). Instead, while evaluating the loss, it would also evaluate the candidate expression and its derivatives at a number of new points that are necessarily unknown when training begins. To draw an analogy with neural networks: imagine the loss function has its own copy of the entire network at each optimization step, and uses it to make predictions and take derivatives at inputs never seen in the training set. Do you have any thoughts on how easy or tricky it would be to make something like this work? If this sounds too vague or confusing, I'd be happy to connect with you over email and provide more details about the problem.
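For concreteness, here is a rough sketch of the kind of objective I have in mind, written as a Julia function of the sort PySR accepts for a full custom objective. I'm assuming `eval_tree_array` and `eval_diff_tree_array` from SymbolicRegression.jl can be called inside the objective; the sampling scheme and the derivative penalty are placeholders I made up for illustration.

```julia
using SymbolicRegression: Dataset, eval_tree_array, eval_diff_tree_array

function my_objective(tree, dataset::Dataset{T,L}, options) where {T,L}
    # Standard data-fit term on the training inputs (y_pred | X)
    y_pred, ok = eval_tree_array(tree, dataset.X, options)
    !ok && return L(Inf)
    data_loss = sum(abs2, y_pred .- dataset.y) / dataset.n

    # New evaluation points, only known at loss-evaluation time.
    # Random sampling here is just a stand-in for however the points
    # would actually be chosen in my problem.
    X_new = randn(T, size(dataset.X, 1), 32)

    # Evaluate the candidate expression's derivative with respect to
    # feature 1 at these previously unseen points
    _, dy_dx1, ok2 = eval_diff_tree_array(tree, X_new, options, 1)
    !ok2 && return L(Inf)

    # Placeholder penalty built from those derivatives
    deriv_penalty = sum(abs2, dy_dx1) / size(X_new, 2)

    return L(data_loss + 0.1 * deriv_penalty)
end
```

Something like this could presumably be passed to PySRRegressor as a Julia string via the full-objective option, but I'm not sure whether evaluating trees and their derivatives at fresh points inside the loss is supported or advisable. Thanks!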