CalebEverett / pysgd

Python implementation of gradient descent algorithms.
http://nbviewer.jupyter.org/github/CalebEverett/pysgd/blob/master/pysgd.ipynb
MIT License

How to define a saddle objective function? #1

Open ktiwari9 opened 7 years ago

ktiwari9 commented 7 years ago

Hi there, I am trying to implement Adam for a saddle function like the one drawn here: https://commons.wikimedia.org/wiki/File:Saddle_point.svg. I am having trouble translating this 3-D function into an objective for your framework. Could you please help out with this?

CalebEverett commented 7 years ago

Sure, I couldn't get to the link you posted, though.


ktiwari9 commented 7 years ago

I think if you copy and paste the link manually it works. Clicking it directly doesn't open it for some reason.

CalebEverett commented 7 years ago

It looks like the saddle function is z = x^2 - y^2, so you could add a new gradient and cost function module that mirrors the form of https://github.com/CalebEverett/pysgd/blob/master/pysgd/objectives/stab_tang.py.

But, without knowing your use case, that function doesn't really have any minima, so it won't converge.
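If it helps, here is a minimal sketch of what such a module could look like. I'm assuming the same interface as stab_tang.py, i.e. the module exposes a cost function and a gradient function that each take the parameter vector theta, with theta = [x, y]; the exact names and signatures below are guesses, so adjust them to match stab_tang.py.

```python
# saddle.py -- hypothetical objectives module for z = x^2 - y^2
# (names and signatures assumed; mirror stab_tang.py for the real interface)
import numpy as np

def cost_fun(theta):
    """Cost of the saddle surface at theta = [x, y]."""
    x, y = theta[0], theta[1]
    return x**2 - y**2

def grad_fun(theta):
    """Gradient of z = x^2 - y^2: [dz/dx, dz/dy] = [2x, -2y]."""
    x, y = theta[0], theta[1]
    return np.array([2 * x, -2 * y])
```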


ktiwari9 commented 7 years ago

Yeah, I was trying to add this saddle function to the objective functions folder, but I am having trouble understanding the format of grad_fun and cost_fun. The elements of grad_fun for the saddle function should be {derivative with respect to x, derivative with respect to y, derivative with respect to z}, but in your case you have a zero and a theta in the list, which confuses me. Also, in the cost function everything is in terms of theta, so in the saddle function case is theta = [x, y, z], with the objective still defined in terms of theta?

What I want to try is to see whether Adam or Adagrad get stuck in a loop on this saddle function, since there is no minimum. Theoretically, they should not diverge, but this is what I wanted to see, which is why I want to add this new objective function.
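For reference, here is the kind of standalone check I have in mind: a plain-NumPy Adam loop run directly against the saddle gradient. This is just a sketch of the textbook Adam update with arbitrary hyperparameters and starting point, not pysgd's implementation.

```python
# Sketch: textbook Adam on z = x^2 - y^2, independent of pysgd.
# Hyperparameters and the starting point are arbitrary choices.
import numpy as np

def grad(theta):
    # Gradient of z = x^2 - y^2 with theta = [x, y]
    return np.array([2 * theta[0], -2 * theta[1]])

theta = np.array([0.5, 1e-3])          # start just off the saddle point
m = np.zeros_like(theta)               # first-moment estimate
v = np.zeros_like(theta)               # second-moment estimate
alpha, beta1, beta2, eps = 0.01, 0.9, 0.999, 1e-8

for t in range(1, 2001):
    g = grad(theta)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g**2
    m_hat = m / (1 - beta1**t)         # bias-corrected first moment
    v_hat = v / (1 - beta2**t)         # bias-corrected second moment
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    if t % 500 == 0:
        print(t, theta)                # watch whether it stalls or escapes along y
```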