hunar4321 / RLS-neural-net

Recursive Least Squares (RLS) with Neural Network for fast learning
MIT License

RLS autoencoder #4

Open snapo opened 12 months ago

snapo commented 12 months ago

Hi, did you somehow figure out how it would be possible to create an autoencoder with RLS? For example, with the MNIST dataset, to remove noise or to generate new digits...

Normally the autoencoder does something like 784 -> 256 -> 784, either for compression or to create new images by starting from the 256-node hidden layer. Is this somehow possible?

hunar4321 commented 11 months ago

This is difficult to make competitive with the standard autoencoder approach because with the current RLS approach we can only fine-tune one layer of the weights. You can do this: 784 -> 784 -> 784

1. Make a random projection of the 784 inputs onto 784 nodes (or more) in the first layer.
2. Add a non-linear activation function like relu or tanh.
3. From the output of the non-linear layer, use RLS to map that non-linear output onto the 784 outputs, where output = input.

For better performance, increase the number of neurons in the middle layer (i.e. more than 784), but this can be computationally intensive because RLS has O(n²) complexity.
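The three steps above can be sketched in NumPy. This is only a minimal illustration, not code from this repo: the class name, the `tanh` activation, and the hyperparameters (`lam`, `delta`) are my own assumptions; the RLS recursion itself is the textbook per-sample update, with the O(n²) cost showing up in the `P` matrix.

```python
import numpy as np

class RLSAutoencoder:
    """Sketch of the scheme above: fixed random projection + nonlinearity,
    with only the output layer trained by recursive least squares."""

    def __init__(self, n_in, n_hid, lam=0.99, delta=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # Step 1: random projection of the inputs onto the hidden layer (never trained)
        self.W1 = rng.standard_normal((n_hid, n_in)) / np.sqrt(n_in)
        # Step 3: the only trainable weights, updated by RLS
        self.W2 = np.zeros((n_in, n_hid))
        # Inverse correlation estimate; n_hid x n_hid, hence the O(n^2) cost
        self.P = np.eye(n_hid) / delta
        self.lam = lam  # forgetting factor (hypothetical choice)

    def step(self, x):
        # Step 2: non-linear activation of the random projection
        h = np.tanh(self.W1 @ x)
        # Step 3: per-sample RLS update; the target is the input itself
        Ph = self.P @ h
        k = Ph / (self.lam + h @ Ph)      # RLS gain vector
        e = x - self.W2 @ h               # reconstruction error
        self.W2 += np.outer(e, k)         # rank-1 weight correction
        self.P = (self.P - np.outer(k, Ph)) / self.lam
        return e
```

On MNIST you would call `step` once per image (flattened to 784 values); the reconstruction error typically drops after a single pass, which is the fast-learning property of RLS, at the price of storing and updating the hidden-size-squared matrix `P`.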

snapo commented 11 months ago

That's a pretty good idea :-) Only the O(n²) will "kinda" be a problem :-)

Thanks for sharing...