LumenPallidium / backprop-alts

This repository has implementations of various alternatives to backpropagation for training neural networks.
MIT License

300,000 samples in plot? But MNIST has 60k + a new backprop alt for you to try! #1

Open mi3law opened 1 month ago

mi3law commented 1 month ago

Hi LP!

I stumbled on your repo here since we're building our own alternative to backprop at aolabs.ai, using weightless neural networks.

Unless I'm missing something, MNIST has only 70k samples total (60k training, 10k testing), so I'm not sure how you have 300,000 samples in your Top1-Accuracy vs Samples plot. Did you use EMNIST?

I'd love to show you what we've cooked up applying our WNNs to MNIST, too. Seems like we're much more sample efficient. Let's chat some time!

LumenPallidium commented 1 month ago

Hello! You are correct, I was doing five epochs for each network, so the x-axis counts cumulative samples seen: 5 epochs × 60,000 training images = 300,000 samples, not unique MNIST images. That does sound interesting, I'll read some of the papers on your website.
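A minimal sketch (not code from this repo) of how the x-axis in such a plot reaches 300,000: each epoch shows the network the full MNIST training set again, so the cumulative sample count grows past the dataset size. The constant names here are illustrative, not taken from the repository.

```python
# Illustrative only: cumulative samples seen over multiple epochs of MNIST.
N_TRAIN = 60_000   # MNIST training-set size
N_EPOCHS = 5       # epochs per network, as described in the reply above

samples_seen = 0
for epoch in range(N_EPOCHS):
    # each epoch passes over every training image once
    samples_seen += N_TRAIN

print(samples_seen)  # 300000
```

This is why a "Top1-Accuracy vs Samples" curve can extend well beyond 70k even though MNIST itself has only 70k images in total.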