ddbourgin / numpy-ml

Machine learning, in numpy
https://numpy-ml.readthedocs.io/
GNU General Public License v3.0
15.35k stars · 3.72k forks

Softsign activation function #18

Closed · jeffin07 closed 5 years ago

jeffin07 commented 5 years ago

Implemented the Softsign activation function and a unit test for the softsign plot


ddbourgin commented 5 years ago

Thanks, @jeffin07! I'm less inclined to keep adding activation functions, since I think we've already covered the majority of nonlinearities used in modern deep learning. If there's a compelling reason to add soft-sign (e.g., a paper / architecture that reports good results using soft-sign), let me know, but otherwise I think we're probably best keeping things as they are.
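For context, softsign is defined as f(x) = x / (1 + |x|), a smooth, bounded alternative to tanh. The PR's actual code isn't shown in this thread; a minimal NumPy sketch of the function and its derivative might look like:

```python
import numpy as np

def softsign(x):
    """Softsign activation: f(x) = x / (1 + |x|), bounded in (-1, 1)."""
    return x / (1 + np.abs(x))

def softsign_grad(x):
    """Derivative of softsign: f'(x) = 1 / (1 + |x|)^2."""
    return 1 / (1 + np.abs(x)) ** 2

x = np.array([-2.0, 0.0, 2.0])
print(softsign(x))        # approximately [-0.667, 0., 0.667]
print(softsign_grad(0.0)) # slope is 1 at the origin
```

Unlike tanh, softsign approaches its asymptotes polynomially rather than exponentially, which is the usual argument made for it in the literature.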

jeffin07 commented 5 years ago

@ddbourgin I didn't see that you had closed #7 :) . I was thinking of implementing a loss function; do you have any suggestions?

ddbourgin commented 5 years ago

That sounds great! Perhaps a cosine or KL-divergence loss? Depending on your enthusiasm, more sophisticated things like triplet loss or connectionist temporal classification loss would be awesome, though there's a reason why I've put them off ;-)
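Of the suggestions above, KL divergence is probably the simplest starting point: D_KL(p ‖ q) = Σᵢ pᵢ log(pᵢ / qᵢ) for discrete distributions. A hypothetical NumPy sketch (not the version that was eventually merged; the `eps` clipping is an assumption to guard against log(0)):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i) for discrete distributions.

    Assumes p and q are valid probability vectors; eps clipping avoids
    log(0) and division by zero when an entry is exactly 0.
    """
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
print(kl_divergence(p, q))  # positive, since p != q
print(kl_divergence(p, p))  # ~0, since the distributions match
```

Note that D_KL is asymmetric (D_KL(p ‖ q) ≠ D_KL(q ‖ p)) and is always non-negative by Gibbs' inequality.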

jeffin07 commented 5 years ago

@ddbourgin That's great, I will close this PR now :)