Open shiffman opened 8 years ago
Perceptron
Wekinator will be great !
Maybe some sort of handwritten glyph detection (https://www.kaggle.com/c/digit-recognizer)? With visualization of neural network with changing weights and glowing synapses and woosh sound while processing inputs? :)
EDIT: I meant recognition, not detection, of course
I haven't had the chance to play with SVMs, so that would be my first choice. :)
I have however coded a multilayer perceptron neural network and visualised it in Processing. It learns from the popular MNIST handwritten digits data set, which is free to download. You can find a pdf I published on issuu with all the visualisation modes, including learning rate, activation and the sigmoid function.
Oh, and the first page of the slides visualises the similarity of the data using a particle-spring system. The digits tend to form groups and align based on writing style. My approach was loosely based on this very nice post.
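For anyone curious what the forward pass of a multilayer perceptron like that boils down to, here's a minimal sketch in plain JavaScript. The layer sizes and weight values are made-up illustrative numbers, not taken from the project above:

```javascript
// Sigmoid activation: squashes any input into the range (0, 1).
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// Forward pass through one fully connected layer.
// weights[j][i] connects input i to output neuron j; biases[j] is added per neuron.
function forward(inputs, weights, biases) {
  return weights.map((row, j) => {
    let sum = biases[j];
    for (let i = 0; i < inputs.length; i++) {
      sum += row[i] * inputs[i];
    }
    return sigmoid(sum);
  });
}

// Tiny 2-3-1 network with arbitrary example weights.
const hidden = forward(
  [0.5, 0.9],
  [[0.1, 0.4], [0.8, -0.6], [0.3, 0.2]],
  [0, 0, 0]
);
const output = forward(hidden, [[0.5, -0.2, 0.7]], [0.1]);
console.log(output); // a single value between 0 and 1
```

An MNIST version is the same idea, just with 784 inputs (one per pixel) and 10 outputs (one per digit).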
@CharlesFr thanks for this post, the suggestions and resources, it's very helpful and I hope to get to this topic soon!
Maybe a Rock Paper Scissors-playing neural network, with visualization of the separate neurons and weights. That would surely be appreciated by the majority of people watching your channel, including myself.
Just want to add this because it's something I've always wanted to learn :)
Agreed with Coderoversially. A NEAT neural network would be great, maybe combined with the rockets from The Nature of Code to evolve rockets with their own hidden layers.
Watching CR #53 and neural network libraries with p5js came up... I've been trying to play with this: http://synaptic.juancazala.com/#/
Maybe a coding challenge (https://github.com/CodingRainbow/Rainbow-Topics/issues/136) at the end of the series (a self-learning 2048 game)
Any idea when we can expect the stream on this one?
I'm hoping to start this sometime between Jan-April 2017.
In your 'algorithms for classification' section, a suggestion: include SVMs and tree-based models as sub-items.
Flappy Bird would be very cool 👍
For supervised learning: gradient descent and the cost function? Since you're awesome at explaining math-related things in a human-approachable way. :) By the way, for everyone interested in this topic, Stanford has a great course on Coursera on Machine Learning!
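To make the gradient descent / cost function pairing concrete, here's a tiny sketch: fitting a single parameter m for the model y = m·x to some points using mean squared error. The data, learning rate, and iteration count are all arbitrary illustrative choices:

```javascript
// Data points lying exactly on the line y = 2x.
const data = [[1, 2], [2, 4], [3, 6]];

// Mean squared error cost for the model y = m * x.
function cost(m) {
  let sum = 0;
  for (const [x, y] of data) {
    const err = m * x - y;
    sum += err * err;
  }
  return sum / data.length;
}

// Derivative of the cost with respect to m (the "gradient").
function gradient(m) {
  let sum = 0;
  for (const [x, y] of data) {
    sum += 2 * (m * x - y) * x;
  }
  return sum / data.length;
}

// Gradient descent: repeatedly step against the gradient to reduce the cost.
let m = 0;
const learningRate = 0.05;
for (let i = 0; i < 100; i++) {
  m -= learningRate * gradient(m);
}
console.log(m); // converges very close to 2
```

The same loop generalizes to neural networks; the only thing that changes is how the gradient is computed (that's what backpropagation does).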
Have to agree with @VVZen. I do get the perceptron, sigmoid neuron, and feed-forward parts of neural networks, but as soon as it's time to implement backpropagation I hit a wall.
I would like to suggest a multiple part Coding Challenge that covers the topics of neural networks, perceptrons, sigmoid neurons, feed forwarding and backpropagation in detail, increasing the level of difficulty with each part. Thanks so much!
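Since backpropagation seems to be the wall for a few of us, here's a minimal sketch of it for a single sigmoid neuron. All the values are illustrative; a full network chains this same chain-rule step backwards layer by layer:

```javascript
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// Train one sigmoid neuron (two weights + a bias) on a single example.
// Backpropagation here is just the chain rule:
//   dCost/dw = dCost/dOut * dOut/dSum * dSum/dw
let w = [0.5, -0.5];
let b = 0;
const inputs = [1, 0.5];
const target = 1;
const learningRate = 0.5;

for (let i = 0; i < 1000; i++) {
  // Forward pass.
  const sum = w[0] * inputs[0] + w[1] * inputs[1] + b;
  const out = sigmoid(sum);

  // Backward pass, using the squared error cost (out - target)^2.
  const dCost_dOut = 2 * (out - target);
  const dOut_dSum = out * (1 - out); // derivative of the sigmoid
  const delta = dCost_dOut * dOut_dSum;

  // Each weight's gradient is delta times the input it multiplies.
  w[0] -= learningRate * delta * inputs[0];
  w[1] -= learningRate * delta * inputs[1];
  b -= learningRate * delta;
}

const finalOut = sigmoid(w[0] * inputs[0] + w[1] * inputs[1] + b);
console.log(finalOut); // close to the target of 1
```

In a multilayer network, each hidden neuron's delta is computed from the deltas of the layer after it, which is why the errors are said to "propagate backwards".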
Soon it may make sense to close this issue and break it out into separate ones. To follow along with recent plans, here is a list of topics from a syllabus I'm prepping to start mid-March.
Looks great @shiffman! Very much looking forward to week 3 especially ;-) I guess the challenge is to explain the algorithms involved without diving too deep into matrix calculations. I don't know if it could help you, but I have been working on a matrix calculations library for Processing that might take away a lot of those worries. Just drop me a line if you are interested.
Will be great to see a basic neural network (like one I wrote: https://github.com/egordorichev/tiny-network) learning to play Mario or another simple game!
And a big playground for interacting with the neural network would also be fun. Like you can see all of the neurons, point at one with the mouse, and see its weights, output value and inputs...
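One way such a playground could represent a neuron so it can be inspected on mouse-over: have each neuron remember its last inputs and output alongside its weights. This is just a sketch of one possible data structure, not from any existing library:

```javascript
// A neuron that stores everything a playground UI would display on hover.
class Neuron {
  constructor(numInputs) {
    // Random weights in [-1, 1); bias starts at zero.
    this.weights = Array.from({ length: numInputs }, () => Math.random() * 2 - 1);
    this.bias = 0;
    this.inputs = [];
    this.output = 0;
  }

  // Compute the output and remember the inputs so the UI can show them.
  feedForward(inputs) {
    this.inputs = inputs;
    let sum = this.bias;
    for (let i = 0; i < inputs.length; i++) {
      sum += this.weights[i] * inputs[i];
    }
    this.output = 1 / (1 + Math.exp(-sum)); // sigmoid activation
    return this.output;
  }

  // Everything the playground shows when this neuron is pointed at.
  inspect() {
    return {
      weights: this.weights,
      bias: this.bias,
      inputs: this.inputs,
      output: this.output,
    };
  }
}

const n = new Neuron(2);
n.feedForward([0.3, 0.7]);
console.log(n.inspect());
```

In a p5.js sketch, `inspect()` could feed a tooltip drawn next to the hovered neuron.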
For anyone following along, this is my in-progress syllabus for my NYU class which is an updated list of topics I plan to cover, please feel free to contribute to the repo anytime!
I would like it if you did some videos implementing reinforcement learning from scratch, and then some videos using reinforcement learning with libraries like tf.js.