shogun-toolbox / shogun

Shōgun
http://shogun-toolbox.org
BSD 3-Clause "New" or "Revised" License

Implement basic backpropagation training algorithm (Deep learning project) #1975

Closed lisitsyn closed 10 years ago

lisitsyn commented 10 years ago

Backpropagation is the standard technique for training neural networks. This task could start with a basic algorithm, and then we'd be able to generalize and refine our design decisions.

Please join the discussion before starting work on any code. We expect to refine the task with further discussion.
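For context, a minimal sketch of what "basic backpropagation" involves, in plain Python (illustrative only, not Shogun code; the one-hidden-layer shape, sigmoid activations, and squared-error loss here are my own assumptions): a forward pass, a backward pass applying the chain rule layer by layer, and a finite-difference check that the analytic gradients are correct.

```python
# Illustrative backprop sketch (not Shogun code): 2 inputs -> 3 hidden
# sigmoid units -> 1 sigmoid output, squared-error loss.
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, W1, W2):
    # Hidden activations, then a single scalar output.
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    y = sigmoid(sum(w * hi for w, hi in zip(W2, h)))
    return h, y

def loss(x, t, W1, W2):
    _, y = forward(x, W1, W2)
    return 0.5 * (y - t) ** 2

def backprop(x, t, W1, W2):
    """Return gradients of the loss w.r.t. W1 and W2."""
    h, y = forward(x, W1, W2)
    # Output delta: dL/dz_out = (y - t) * sigmoid'(z_out) = (y - t) * y * (1 - y)
    d_out = (y - t) * y * (1.0 - y)
    gW2 = [d_out * hi for hi in h]
    # Hidden deltas, propagated back through W2 (the chain rule step).
    d_hid = [d_out * w * hi * (1.0 - hi) for w, hi in zip(W2, h)]
    gW1 = [[d * xi for xi in x] for d in d_hid]
    return gW1, gW2

random.seed(0)
x, t = [0.5, -1.0], 1.0
W1 = [[random.uniform(-1, 1) for _ in x] for _ in range(3)]
W2 = [random.uniform(-1, 1) for _ in range(3)]

gW1, gW2 = backprop(x, t, W1, W2)

# Sanity check: compare one analytic gradient to a central finite difference.
eps = 1e-6
W2p = W2[:]; W2p[0] += eps
W2m = W2[:]; W2m[0] -= eps
num = (loss(x, t, W1, W2p) - loss(x, t, W1, W2m)) / (2 * eps)
print(abs(num - gW2[0]) < 1e-8)  # → True
```

A gradient check like the one at the end is a common way to validate any backprop implementation before worrying about training speed or architecture.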

khalednasr commented 10 years ago

I'll work on this one :), anything I should keep in mind?

lisitsyn commented 10 years ago

What is important is how to design it to be flexible and generic.

Theo, could you please share your thoughts on this?

ghost commented 10 years ago

Hi Sergey,

I'm also interested in this entrance task. Could I try that? Not sure if I'm too late to discover this, but I can start right now, and then implement a CNN in that issue (https://github.com/shogun-toolbox/shogun/issues/1974). By the way, a naive question: is this task relatively independent of other parts of Shogun? I mean, is it okay to first focus on implementing the NN and then consider how to connect the code with the entire Shogun project? One more question: is there any existing code relevant to the deep learning project now? If not, we can start coding from scratch :)

lisitsyn commented 10 years ago

Could I try that?

Yeah sure. I think work on this task could be distributed somehow.

Not sure if I'm too late to discover this, but I can start right now, and then implement a CNN in that issue (#1974).

We actually broke it down so it makes sense to first implement some building blocks.

By the way, a naive question: is this task relatively independent of other parts of Shogun? I mean, is it okay to first focus on implementing the NN and then consider how to connect the code with the entire Shogun project?

Not really - I believe it would be better to start by implementing this as Shogun classes. Just feel free to ask if you have any questions about that.

One more question: is there any existing code relevant to the deep learning project now? If not, we can start coding from scratch :)

No, we don't have anything on that yet.

achintp commented 10 years ago

Hey Sergey,

I'm interested in attempting to do this as well, could I have a go at it? Also, would implementing convolution be done after the basic blocks (backprop, feed-forward) are in place?

Also, could you elaborate a bit on what you had in mind when you said flexible and generic?

Thanks!

rsong6 commented 10 years ago

Hi Sergey, I'm also interested in working on this task!

lisitsyn commented 10 years ago

Sorry for some delay.

I'm interested in attempting to do this as well, could I have a go at it?

We have nothing on that yet, so it is unlikely that we'll clash. So, anyone interested - just do it ;)

Also would implementing convolution be done after the basic blocks (bp, feed forward) are in place?

I don't think they really depend on each other. It is ok to develop some convolution code independently of anything related to NNs.

ghost commented 10 years ago

Hi Sergey,

Thanks for your reply. I've already implemented the BP algorithm with Eigen and just opened a pull request (https://github.com/shogun-toolbox/shogun/pull/2031), though I still have some problems, as described there. Could you please take a look? Thanks :)

vigsterkr commented 10 years ago

@khalednasr can we close this?

khalednasr commented 10 years ago

@vigsterkr yup!