Hi @kartikdutt18, that's a very good initiative to take. As I am currently working on the soft shrink function and have almost completed the work there, is it okay if I take the hard shrink function also?
Feel free to do so. Thanks.
Hi, @kartikdutt18 I'll work on the implementation of CELU. Is it okay to move forward with it?
Feel free to do so. Thanks.
I'll be working on 7), the ISRU function.
Great. Thanks.
Is ELU implemented? If not, I'll work on it.
Hi @gaurav-singh1998, the ELU function is implemented in the src/mlpack/methods/ann/layer folder of mlpack.
@kartikdutt18 thanks for opening the issue, just added some tags, so perhaps we can remove the Good First Issue from the title.
Hi @zoq, I have removed them. Thanks.
I'll be picking up 9) ISRLU.
@kartikdutt18 Looking at the activation functions, SELU is already implemented:
@zoq, Sorry I missed that, I will remove it from the list.
I will work on Inverse Square Root Linear if no one is doing that.
@kartikdutt18 since ISRLU has different values for different ranges, should I put if statements to handle the different ranges?
Hi @PranavReddyP16, yes, that should be fine. You can look at the softplus, softsign, rectifier, or identity functions for examples; I think they all use if conditions. Thanks.
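For illustration, here is a minimal standalone sketch of what the piecewise branches could look like for ISRLU, where f(x) = x for x >= 0 and x / sqrt(1 + alpha * x^2) otherwise. The free functions and the explicit alpha parameter here are just illustrative; since alpha is a hyperparameter, the real implementation would follow the layer conventions discussed further down in this thread.

```cpp
#include <cmath>

// ISRLU forward pass, piecewise:
// f(x) = x                           for x >= 0
// f(x) = x / sqrt(1 + alpha * x^2)   for x <  0
double Fn(const double x, const double alpha)
{
  if (x >= 0)
    return x;
  return x / std::sqrt(1 + alpha * x * x);
}

// ISRLU derivative:
// f'(x) = 1                               for x >= 0
// f'(x) = (1 / sqrt(1 + alpha * x^2))^3   for x <  0
double Deriv(const double x, const double alpha)
{
  if (x >= 0)
    return 1;
  const double isr = 1 / std::sqrt(1 + alpha * x * x);
  return isr * isr * isr;
}
```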
Hi @kartikdutt18, I would like to work on the implementation of Inverse Square Root Linear. Can I work on it?
Hi @codeboy5, I think @PranavReddyP16 is working on that. (I will add some other good first issues that require help; if you want to contribute, you can work on one of those or find some other interesting issue.) Thanks.
@kartikdutt18 Thanks, please do that. Is any activation function still unclaimed? I recently read about the swish activation function; has someone implemented that previously?
Hi @codeboy5, I don't think so. And yes, swish has already been implemented. I think the above list contains all activation functions that haven't been implemented.
@kartikdutt18 Thanks a lot. Please add some more issues to work on.
Will do, @ojhalakshya and I are currently compiling another list of functions that need to be added, hopefully it will be posted by today or tomorrow.
Hi @kartikdutt18, is it okay if I work on the ISRLU activation function?
Hi @prince776, I think @PranavReddyP16 has implemented it, but there was some mix-up in branches, so there isn't a PR open. I think he could give you a better reply.
Hi, @PranavReddyP16 have you implemented/planning to implement ISRLU?
Hey, yeah, I have implemented it, but I need to fix something with my PR that I couldn't get to because I've been busy with college work. I'll be able to do it today, most likely.
Ok, then I'll leave this to you. Good luck with it.
Thanks :)
Hi @birm, can I work on adding the bipolar sigmoid activation function? I've seen it in a paper and in some class slides on neural networks.
The benefit of an open source project such as this is that you don't have to ask permission! :) If you want to, just put in a PR and let us know when you'd like it to be reviewed.
I would like to add a function too, but I don't quite understand the format of activation function files present. Any help will be greatly appreciated as I am new here and want to contribute!
There are two ways in which we implement activation functions.
1) Activation functions that have no hyperparameters are added in mlpack/src/mlpack/methods/ann/activation_functions/. You can take a look at any of these to get an idea of how they are implemented (I'd suggest tanh_function.hpp, since most of us know how tanh works in forward and backward propagation).
2) Activation functions that have some hyperparameter are added as a layer in mlpack/src/mlpack/methods/ann/layer/. You can take a look at elu.hpp, as it is pretty simple.
After implementing the class itself, we also have to add tests in mlpack/src/mlpack/tests/activation_functions_test.cpp for the corresponding function. A rough sketch of the first case is below.
Hope this helps.
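To make the first case concrete, here is a minimal, hypothetical sketch in the style of tanh_function.hpp, using the bipolar sigmoid mentioned above, f(x) = (1 - e^(-x)) / (1 + e^(-x)) = tanh(x / 2). The class name is made up, and the exact method signatures should be double-checked against an existing header before opening a PR.

```cpp
#include <mlpack/prereqs.hpp>

namespace mlpack {
namespace ann {

// Hypothetical activation function with no hyperparameters, so it would
// live in methods/ann/activation_functions/.
class BipolarSigmoidFunction
{
 public:
  // f(x) = (1 - e^{-x}) / (1 + e^{-x}) = tanh(x / 2).
  static double Fn(const double x)
  {
    return std::tanh(x / 2.0);
  }

  // Element-wise version for Armadillo vectors/matrices.
  template<typename InputVecType, typename OutputVecType>
  static void Fn(const InputVecType& x, OutputVecType& y)
  {
    y = arma::tanh(x / 2.0);
  }

  // Derivative written in terms of the output y = f(x):
  // f'(x) = (1 - y^2) / 2.
  static double Deriv(const double y)
  {
    return (1.0 - y * y) / 2.0;
  }

  // Element-wise derivative.
  template<typename InputVecType, typename OutputVecType>
  static void Deriv(const InputVecType& y, OutputVecType& x)
  {
    x = (1.0 - arma::square(y)) / 2.0;
  }
};

} // namespace ann
} // namespace mlpack
```

A matching test case in activation_functions_test.cpp would then check Fn and Deriv against a few hand-computed values.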
Is sigmoid done?
I think we have covered all functions. Have a look at the codebase, and see if you can find something that I missed. We would be more than happy to add it.
Is sigmoid done?
Yes.
@gauthampkrishnan You can do a simple search on the repository; it would give you an idea whether any function is implemented or not.
@codeboy5, it might be a better idea to look at the codebase, since some functions, such as SELU, are implemented as aliases.
Can I still work on this?
Hi @AniTho, I think all activation and loss functions have an open PR. If you find a function that is missing, add it to the list and open a PR for the same.
Okay @kartikdutt18, I will go through the code to find which ones have been implemented and see what else I can add.
Is the softmax activation function done? I can't find it under methods/ann/activation_functions.
There is a PR open for that by a member. You can also search the PRs to see which functions are being worked on. I think we have covered all of them, but see if you can find any missing.
There's one, but I think no work has been done on it since 2019. @codeboy5 proposed to implement it, but I don't think he has done it.
I am not sure we need another PR for the same issue; we can wait for a reply. If you want, there are various other open issues that need help which you could work on, or you could get familiar with the codebase.
@abinezer I already have a PR open for that and I plan to fix it up soon. Right now I'm a little busy with exams.
Cool, @PranavReddyP16!
@gaurav-singh1998 are you done with CELU? I would like to work on it.
Hey @saraansh1999, there is a PR open for it by me; look at #2191. Thanks.
Hey @kartikdutt18, I was looking for the parametric ReLU activation but couldn't find it. If it hasn't been implemented yet, I would like to work on it. I think this would be a good addition, as it allows the negative slope to be learned, unlike Leaky ReLU. Thanks.
Hey @AbhiSaphire, parametric ReLU has been implemented; take a look at this. Thanks.
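For anyone comparing the two: the only difference from Leaky ReLU is that the negative-region slope alpha is a learned parameter, so backpropagation also needs a gradient with respect to alpha. A schematic sketch with illustrative free functions, not mlpack's actual prelu implementation:

```cpp
// PReLU forward pass: identical in form to Leaky ReLU, but alpha is
// trained rather than fixed.
double Fn(const double x, const double alpha)
{
  return (x >= 0) ? x : alpha * x;
}

// Gradient with respect to the input, for backpropagating errors.
double Deriv(const double x, const double alpha)
{
  return (x >= 0) ? 1.0 : alpha;
}

// Gradient with respect to alpha itself; this extra term is what lets
// the negative slope be learned.
double GradAlpha(const double x)
{
  return (x >= 0) ? 0.0 : x;
}
```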
Hi everyone, I have compiled a list of all activation functions that are currently not implemented in mlpack but can be found in either TensorFlow or PyTorch:
SELU
I might have missed some functions; feel free to add them to the list. If anyone would like to take up the above functions, please feel free to do so. I hope this is okay with the members of the organisation; this was done to reduce the effort of finding unimplemented functions as well as to bring state-of-the-art activation functions to mlpack. In case I missed something or added an activation that has already been implemented, please forgive me. Thanks.