diegocasmo / iis-project


Read Computer Vision facial landmarks and output classification using Emotion Synthesis format #21

Closed ankurshukla03 closed 6 years ago

ankurshukla03 commented 6 years ago

Closes #14 Closes #15

{
  'ANGER': 0.00040305041178978045,
  'DISGUST': 0.2010905945567386,
  'FEAR': 0.0005442277116194078,
  'HAPPY': 0.7968281208888155,
  'SADNESS': 0.0011075836616630938,
  'SURPRISE': 2.6422769373673852e-05
}

I did not receive any mail from Kalyan. I am planning to send this format as the output to the emotion synthesis group; it can serve as an example, since it was generated using our MLP classifier model. Check the solution notebook in the emotion-synthesis branch for the steps.
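For reference, a minimal sketch of how a `predict_proba` row could be mapped into the format above (the label order and the `to_emotion_dict` helper are assumptions for illustration; the real order would come from `clf.classes_` in the notebook):

```python
import numpy as np

# Assumed label order; in the notebook this should come from clf.classes_
LABELS = ['ANGER', 'DISGUST', 'FEAR', 'HAPPY', 'SADNESS', 'SURPRISE']

def to_emotion_dict(proba_row):
    """Zip one row of predict_proba output with the emotion labels."""
    return {label: float(p) for label, p in zip(LABELS, proba_row)}

probs = np.array([0.0004, 0.2011, 0.0005, 0.7968, 0.0011, 0.0001])
result = to_emotion_dict(probs)
print(result['HAPPY'])  # 0.7968
```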

tristaaan commented 6 years ago

@diegocasmo in the final method, can we just manually select the 15 components PCA deems important? Or do we still have to run it on the single row of data we get?

diegocasmo commented 6 years ago

in the final method, can we just manually select the 15 components PCA deems important? Or do we still have to run it on the single row of data we get?

@tristaaan I'm not sure

tristaaan commented 6 years ago

Great find on pca.transform. I saved the pca.fit(... model and we can load it here. Everything runs well except the prediction.

I suspect (from experience in the other project) preprocessing is the culprit. Normalization specifically.

from sklearn.preprocessing import normalize
import numpy as np

x = np.array([[-3, -2, 0, 2, 3], [-10, -8, 0, 8, 10]])
normalize(x, axis=1)
#array([[-0.58834841, -0.39223227,  0.        ,  0.39223227,  0.58834841],
#       [-0.55215763, -0.4417261 ,  0.        ,  0.4417261 ,  0.55215763]])
normalize(x, axis=0)
#array([[-0.28734789, -0.24253563,  0.        ,  0.24253563,  0.28734789],
#       [-0.95782629, -0.9701425 ,  0.        ,  0.9701425 ,  0.95782629]])

axis=1 vs. axis=0 makes quite a difference. We could normalize on the values we've been finding so far in the test set, or recompute it all with axis=1 (row-by-row normalization).
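On the save/load point, a sketch of persisting the fitted PCA model with joblib and applying it to a single row at prediction time (the file name, array shapes, and random data are placeholders, not the actual notebook code); the key detail is normalizing the incoming row with the same axis used during training:

```python
import os
import tempfile

import joblib
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import normalize

# Toy stand-in for the landmark feature matrix (shapes are assumptions)
rng = np.random.RandomState(0)
X_train = rng.rand(100, 20)

# Fit PCA on row-normalized data (axis=1), then persist the fitted model
pca = PCA(n_components=15).fit(normalize(X_train, axis=1))
path = os.path.join(tempfile.mkdtemp(), 'pca_model.pkl')
joblib.dump(pca, path)

# At prediction time: load the model and transform a single row,
# normalized with the SAME axis used during training
pca_loaded = joblib.load(path)
row = normalize(X_train[:1], axis=1)
reduced = pca_loaded.transform(row)
print(reduced.shape)  # (1, 15)
```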

tristaaan commented 6 years ago

In the final notebook with axis=1, SVM retains its crown:

kNN with PCA classifier accuracy:           0.53
SVM with PCA classifier accuracy:           0.80
MLP with PCA classifier accuracy:           0.69
Random Forest with PCA classifier accuracy: 0.56

tristaaan commented 6 years ago

Of all the silly coding standards out there, one I really like is import ordering for Python.

  1. System imports (os, sys, json, ...)
  2. Third-party imports (numpy, ...)
  3. Local imports (here, constants.py)

Each category should be alphabetically ordered. It makes the code a lot easier to read.

ankurshukla03 commented 6 years ago

I am training the SVM model with CalibratedClassifierCV so we can use the predict_proba function when giving output to the emotion synthesis group, since we need to give them a confidence for each of the labels. Just to make sure, can you check that the function I am using is correct? I read the docs and it seems fine to me; it also doesn't affect the accuracy of our SVM model.

svm = CalibratedClassifierCV(clf)
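A minimal, self-contained sketch of the calibration wrapper (the synthetic data and LinearSVC base classifier are placeholders for our actual SVM and features, not the notebook code):

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Placeholder data standing in for the facial-landmark features
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

clf = LinearSVC()                  # plain SVM: no predict_proba of its own
svm = CalibratedClassifierCV(clf)  # wraps it with probability calibration
svm.fit(X, y)

proba = svm.predict_proba(X[:1])   # probabilities are now available
print(proba.shape)  # (1, 2): one probability per class
```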

tristaaan commented 6 years ago

What email client are you using? That link goes to some malware for me, click on it as few times as possible.

-Tristan Wright

On Sun, May 27, 2018 at 11:41 AM, Ankur Shukla notifications@github.com wrote:

I am training SVM model with CalibratedClassifierCV ( http://scikitlearn.org/stable/modules/generated/sklearn.calibration. CalibratedClassifierCV.html) to use predict_proba function while giving output to emotion synthesis group as we need to give them the confidence with each of the labels. Just to make sure can you guys check that the above function i am using is not wrong i read it and it seems fine to me, also it doesn't affect the accuracy of our SVM model.

svm = CalibratedClassifierCV(clf)


ankurshukla03 commented 6 years ago

I am using github itself, not any email client for that comment.

What email client are you using? That link goes to some malware for me, click on it as few times as possible.

tristaaan commented 6 years ago

Lol, I'm emailing all of these, so it won't be an email client issue. Just check the link; markdown format will make it look nicer. [cool site](http://zombo.com)

ankurshukla03 commented 6 years ago

Lol, Should have looked for the syntax first. Thanks btw.

Just check the link, markdown format will make it look nicer.

tristaaan commented 6 years ago

The link itself though. It goes to the wrong scikit...

http://scikitlearn.org/stable/modules/generated/sklearn.calibration.CalibratedClassifierCV.html
vs
http://scikit-learn.org/stable/modules/generated/sklearn.calibration.CalibratedClassifierCV.html

ankurshukla03 commented 6 years ago

Shit. Changed the link now.

The link itself though. It goes to the wrong scikit...

tristaaan commented 6 years ago

Is the PCA model different too since we're normalizing differently?

ankurshukla03 commented 6 years ago

Oops. Changing it now

Is the PCA model different too since we're normalizing differently?

tristaaan commented 6 years ago

Cool. I think if the meeting tomorrow goes OK, we can make the input changes to what they're feeding us and merge this.

tristaaan commented 6 years ago

Models were generated by exporting the data-analysis notebook to a script and running it from the command line with Python 2 explicitly.

tristaaan commented 6 years ago

Pretty certain this can be merged now.

diegocasmo commented 6 years ago

Let's wait until tomorrow, minor changes may occur during the meeting. The downgrade of all the notebooks to Python 2 can be done in a different PR.

tristaaan commented 6 years ago

I have no idea what to do with this branch now. Merge away once conflicts are resolved?

diegocasmo commented 6 years ago

@tristaaan Don't you need the changes in this branch for the presentation?

tristaaan commented 6 years ago

That's what it's for. Yes, I resolved the merge conflict. Merging away.