saltzmanj / keyboardguys


Test ML Algorithms #11

Open saltzmanj opened 7 years ago

saltzmanj commented 7 years ago

Date is flexible. The task on the Gantt chart says "Implement Algorithm": I suppose this would be the first step.

saltzmanj commented 7 years ago

I implemented a backpropagation neural net here in Python, and I am currently testing it on the MNIST dataset of handwritten digits.
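For anyone following along, here is a minimal sketch of the kind of single-hidden-layer backprop net described above. It is not the repo's code: the function names, sigmoid activations, squared-error loss, and toy XOR data are all stand-ins; `hidden`, `iters`, and `lam` are placeholders for the hidden layer size, iteration cap, and lambda discussed in the comments below.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, Y, hidden=25, iters=30, lr=0.5, lam=0.01):
    """Full-batch gradient descent on a two-layer sigmoid net (sketch only)."""
    n, d = X.shape
    k = Y.shape[1]
    rng = np.random.default_rng(0)
    W1 = rng.normal(0.0, 0.1, (d, hidden))
    W2 = rng.normal(0.0, 0.1, (hidden, k))
    for _ in range(iters):
        # forward pass
        A1 = sigmoid(X @ W1)              # hidden activations
        A2 = sigmoid(A1 @ W2)             # output activations
        # backward pass for a squared-error loss
        d2 = (A2 - Y) * A2 * (1 - A2)
        d1 = (d2 @ W2.T) * A1 * (1 - A1)
        # gradient step with an L2 penalty scaled by lam
        W2 -= lr * (A1.T @ d2 / n + lam * W2)
        W1 -= lr * (X.T @ d1 / n + lam * W1)
    return W1, W2

# toy usage: XOR data standing in for flattened digit images
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train(X, Y, hidden=5, iters=5000, lr=1.0, lam=0.0)
print(sigmoid(sigmoid(X @ W1) @ W2).round(2))
```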

saltzmanj commented 7 years ago

Results for hidden layer = 25, max_optim = 30:

[result plots attached]

saltzmanj commented 7 years ago

Here are the learning curves with max_iter = 50. [plot attached]

saltzmanj commented 7 years ago

This is with max_iter = 100... not looking any better. [plot attached]

saltzmanj commented 7 years ago

Decreasing lambda to 0.01 seemed to clean up the high-bias issue a bit. [plot attached]
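For context on the lambda tweak: this is presumably the L2 regularization strength. A rough sketch of where it enters the cost, matching the hypothetical squared-error net above (not the repo's code):

```python
import numpy as np

def regularized_cost(A2, Y, W1, W2, lam):
    # data term: how well the outputs A2 fit the labels Y
    n = Y.shape[0]
    fit = np.sum((A2 - Y) ** 2) / (2.0 * n)
    # penalty term: shrinking lam lets the weights grow, which reduces
    # underfitting (high bias) at the risk of more variance
    penalty = lam / (2.0 * n) * (np.sum(W1 ** 2) + np.sum(W2 ** 2))
    return fit + penalty
```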

arlieu commented 7 years ago

optics-script.zip

I tested my script on some random matrices in a text file and it works fine. When I use PNG images converted to BMPs, the matrices are formed correctly, but the OPTICS analyses are timing out for some reason. If we can figure out why it's timing out, determining note types should be straightforward.
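One thing worth ruling out before digging deeper: a full-resolution bitmap can produce a huge point set, and a naive OPTICS pass does roughly O(n²) pairwise-distance work. A hypothetical size check (the filename and the 2000-point cap are made up, and this uses Pillow and numpy rather than the attached script):

```python
import numpy as np
from PIL import Image

# Count the foreground points the image actually produces; tens of thousands
# of points can make the clustering look like a hang.
img = np.array(Image.open("notes.bmp").convert("L"))   # hypothetical filename
points = np.argwhere(img < 128)                        # (row, col) of dark pixels
print("foreground points:", len(points))

# Optionally thin the point set before clustering to test whether size is the issue.
if len(points) > 2000:
    step = len(points) // 2000 + 1
    points = points[::step]
```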

arlieu commented 7 years ago

optics-script-easy-implementation.zip

Found an easy implementation of OPTICS. It works with Python 2.4 and 2.5. The only thing that needs to be changed is the distance calculation, which is currently done through the hcluster package.
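If it helps, a dependency-free stand-in for that distance calculation could be as simple as plain Euclidean distance. Sketch only; the function name is made up and it assumes the implementation just needs a point-to-point distance:

```python
import math

def euclidean(p, q):
    # straight-line distance between two points of equal dimension
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

print(euclidean((0, 0), (3, 4)))   # 5.0
```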

arlieu commented 7 years ago

I used the hand-drawn notes that @saltzmanj posted under the 'Segmentation' issue as test cases. So far, the clustering script I wrote can distinguish between quarter notes and half notes. It also works on computer-generated whole notes, but I have not tested it on hand-drawn ones.

Hand-Drawn-Tests.zip note_recognition_test1.pdf
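The exact rule isn't spelled out in the attachments, but one plausible way a clustering script could separate filled (quarter) heads from hollow (half/whole) heads is the fill ratio of each cluster's bounding box. A rough sketch with an arbitrary threshold, not the posted script:

```python
import numpy as np

def note_type(cluster_points):
    # cluster_points: (row, col) pixel coordinates belonging to one note head
    pts = np.asarray(cluster_points)
    rows = pts[:, 0].max() - pts[:, 0].min() + 1
    cols = pts[:, 1].max() - pts[:, 1].min() + 1
    fill = len(pts) / float(rows * cols)   # fraction of the bounding box that is ink
    # filled heads cover most of the box; hollow heads cover far less
    return "quarter" if fill > 0.6 else "half/whole"
```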

saltzmanj commented 7 years ago

Nice! What threshold does it use for black/white? Is it totally binary?


arlieu commented 7 years ago

It takes an RGB image and compares each pixel's tuple directly, with the threshold at >= (50, 50, 50).
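In other words, something like the following rough sketch (not the actual script; it reads the >= (50, 50, 50) threshold as "all three channels at or above 50 count as background"):

```python
from PIL import Image

def to_binary(path):
    # 1 = ink, 0 = background, using the (50, 50, 50) cutoff described above
    img = Image.open(path).convert("RGB")
    w, h = img.size
    matrix = []
    for y in range(h):
        row = []
        for x in range(w):
            r, g, b = img.getpixel((x, y))
            row.append(0 if (r >= 50 and g >= 50 and b >= 50) else 1)
        matrix.append(row)
    return matrix
```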

arlieu commented 7 years ago

I cleaned up the note recognition and committed it to the repo. It now recognizes blank spaces and bar lines, and the script has been restructured for easier translation into LabVIEW. I also posted the test cases I used, so hopefully we can get similar images once we apply the segmentation.
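For whoever does the LabVIEW port, blank-space and bar-line detection could reasonably be driven by column ink counts. This is a hypothetical sketch of that idea, not the committed script (the 0.9 factor and the `staff_height` parameter are made up):

```python
import numpy as np

def column_features(binary, staff_height):
    # binary: 2-D array with 1 = ink, 0 = background
    binary = np.asarray(binary)
    col_ink = binary.sum(axis=0)                        # ink pixels per column
    blanks = np.where(col_ink == 0)[0]                  # columns with no ink at all
    bars = np.where(col_ink >= 0.9 * staff_height)[0]   # tall, thin runs of ink
    return blanks, bars
```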