Hi, I’m really sorry for this kind of spam, but I really need your help.
The reason I’m forwarding this message is that I’ve been working for a month on a NEW ACTIVATION FUNCTION that, according to my experiments, works better than ReLU, ELU, Leaky ReLU, SELU, PReLU, Swish…
Or any other activation function.
I’ve been paying for AWS GPUs for the experiments out of my own money and working on this as a side project (I’m 17 and still in high school), but now I need your help to publish it on the internet, open access for everyone, since arXiv asks me for an endorsement. I think the paper is ready for publication, and we could discuss it more extensively if you wish.
Any help will be much appreciated.
Thanks in advance,
Eric Alcaide.