mathmanu / caffe-jacinto

This repository has moved. The new link can be obtained from https://github.com/TexasInstruments/jacinto-ai-devkit

Fixed filter bank neural networks #37

Open · ghost opened this issue 5 years ago

ghost commented 5 years ago

This type of neural network may work well with your DSP chips: https://github.com/S6Regen/Fixed-Filter-Bank-Neural-Networks

mathmanu commented 5 years ago

Do you have a publication or results to show that this actually works well? Is there a paper where this has been published?

ghost commented 5 years ago

Hi Mathew,

Sometimes in life you just have to do the work yourself, if you want the rewards.

Regards,
Sean O'Connor

mathmanu commented 5 years ago

I am not motivated enough to spend my energy on this, because:

“The woods are lovely, dark and deep, But I have promises to keep, And miles to go before I sleep, And miles to go before I sleep.” - Robert Frost, "Stopping by Woods on a Snowy Evening", lines 13-16

Let me say that random filters work just fine and you only need to train the last fully connected layer - this can speed up training significantly. Would you like to try this yourself and get the reward?
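
For anyone who wants to try it, here is a minimal sketch of that experiment, assuming PyTorch rather than Caffe (the architecture and names are illustrative, not from this repository):

```python
import torch
import torch.nn as nn

# Hypothetical small CNN: the convolutional filters keep their random
# initialization and are never updated; only the final fully connected
# layer is trained.
class RandomFilterNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        for p in self.features.parameters():
            p.requires_grad = False  # freeze the random filters
        # 32x32 input -> two 2x poolings -> 8x8 feature maps
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = RandomFilterNet()
# Only the classifier's parameters go to the optimizer, so the frozen
# features act as a fixed random projection of the input.
optimizer = torch.optim.SGD(model.classifier.parameters(), lr=0.01)
```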

ghost commented 5 years ago

Random filters are effectively a locality sensitive hash. Have you not looked at the other repositories on my GitHub page? All these random neural networks, extreme learning machines, echo state networks and reservoir computers are locality sensitive hash associative memories. The same idea keeps being rediscovered again and again in different guises. I kinda boiled it down to its essence here: https://github.com/S6Regen/Associative-Memory

Dude, I hope you are using the Walsh Hadamard transform with random sign flipping to do the random projections, and not some method that costs more than O(n log(n))?
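
For concreteness, a minimal sketch of that sign-flip-plus-WHT projection, assuming NumPy (the function names are illustrative):

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform of a length 2**k vector, O(n log n)."""
    x = x.copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b  # butterfly step
        h *= 2
    return x

def random_projection(x, rng):
    # Random sign flips, then the WHT mixes every input element into
    # every output element: O(n log n) instead of the O(n^2) cost of
    # multiplying by a dense random matrix.
    signs = rng.choice([-1.0, 1.0], size=len(x))
    return fwht(x * signs)

rng = np.random.default_rng(0)
y = random_projection(np.arange(8, dtype=np.float64), rng)
```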

ghost commented 5 years ago

If I may be equally poetic, I puked out some information about the Walsh Hadamard transform here: https://github.com/FALCONN-LIB/FFHT/issues/26