The bounty has been claimed by an intrepid developer, so it is now closed while their code is in testing and internal review. Please pause any work you are doing towards the bounty-- we want to be respectful of your time and energies.
More details will be shared soon, and in case the submitted code does not work we'll reopen this bounty and notify participants.
Our weightless neural network framework already achieves great results on MNIST in terms of training efficiency-- we can get 60-85%+ accuracy from training on <1000 samples (standard MNIST training uses 60k samples) with vastly smaller agents than deep learning / CNNs; the variance in accuracy comes from different neural connection strategies. Try it at mnist.aolabs.ai or load the app locally from the repo with Streamlit or Docker.
To get these results, we're also down-sampling MNIST from 0-255 grayscale to black-&-white, eliminating the grayscale by thresholding pixel values at 200; see the relevant code in our MNIST application here.
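For concreteness, here is a minimal sketch of that kind of thresholding, assuming the image arrives as a NumPy array of raw 0-255 pixel values. The function name `down_sample_to_bw` and the mapping direction are illustrative only; the actual `down_sample_item` in the repo is the source of truth.

```python
import numpy as np

def down_sample_to_bw(image: np.ndarray, threshold: int = 200) -> np.ndarray:
    """Collapse a 28x28 MNIST image from 0-255 grayscale to 1-bit black & white.

    Hypothetical stand-in for the repo's down_sample_item, shown only to
    illustrate the thresholding idea above; the exact threshold and mapping
    direction live in the actual application code.
    """
    return (image > threshold).astype(np.uint8)  # each pixel becomes 0 or 1

# Synthetic example just to show shapes -- not real MNIST data
fake_image = np.random.randint(0, 256, size=(28, 28), dtype=np.uint8)
bw = down_sample_to_bw(fake_image)
assert bw.shape == (28, 28) and set(np.unique(bw)) <= {0, 1}
```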
Now it is time to run our application against MNIST in full-grayscale to see how that affects training speed and accuracy in pursuit of our mission for more efficient, continuously learning AI. Since the code behind our MNIST efforts is open source, we're opening up this project as a bounty to our community-- we'll pay whoever builds this and recognize them as a contributor!
Bounty Rewards and Bonuses
- $500 - extending this app to work on full 255-grayscale MNIST by Oct 27th
- +$250 bonus - for any further extensions of the app (get creative!) OR for finishing before Oct 27th
- +$250 bonus - for a replicable +10% increase in accuracy
Project Details & Scope
What does this project entail? To help you scope it out and get started, here are the main changes to the existing code that you'd need to think through, make, and test as part of the work here:
- The `down_sample_item` function - MNIST images are digitized pictures of single handwritten digits, 28 x 28 pixels in size, with each pixel value somewhere between 0 (which represents white) and 255 (which represents black); we were down-sampling these values to make the images B&W, so this function will need to handle full grayscale (see the sketch after this list).
- The `arch_z` output layer, which likely needs no change, since 4 bits can encode 16 values and there are only 10 output classes in MNIST (0-9).
- The `INPUT` passed to the AO agent through the application (here and here and here), which needs to match the new, larger grayscale input.
- The `arr_to_img` function, which needs to work with grayscale so that we can visually inspect the agents' binary responses as grayscale images in the same way we can visually inspect the B&W images in the current MNIST app.
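As a starting point for thinking about these changes, here is one possible way to expose 0-255 pixel values to a binary-input agent: a plain 8-bit binary expansion per pixel, which grows the input from 784 bits to 784 x 8 = 6272 bits. This is only a sketch under that assumption-- the function name `grayscale_to_binary_input` is hypothetical, not the repo's API, and other encodings (thermometer, gray code) may train better.

```python
import numpy as np

def grayscale_to_binary_input(image: np.ndarray, bits_per_pixel: int = 8) -> np.ndarray:
    """Encode a 28x28 grayscale image as a flat binary array for a binary-input agent.

    Illustrative only: in the real app this logic would live in down_sample_item
    and the INPUT wiring. Here each 0-255 pixel becomes its 8-bit binary
    expansion (most significant bits kept if bits_per_pixel < 8); thermometer
    or gray-code encodings are other options worth testing.
    """
    flat = image.astype(np.uint8).flatten()              # 784 pixel values
    bits = np.unpackbits(flat.reshape(-1, 1), axis=1)    # shape (784, 8), MSB first
    return bits[:, :bits_per_pixel].flatten()            # 784 * 8 = 6272 bits by default

fake_image = np.random.randint(0, 256, size=(28, 28), dtype=np.uint8)
agent_input = grayscale_to_binary_input(fake_image)
print(agent_input.shape)  # (6272,) vs. 784 bits in the current B&W pipeline
```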
Ready?

Should you work on this project, the AO team is here to support you throughout your development (addressing questions, providing more context, even pair-programming with you, etc.)-- say hi on Discord or meet with our founder Ali here.
To take on this project, please comment on this issue or email eng@aolabs.ai.