nickgillian / grt

gesture recognition toolkit

Return weights for every sample? #150

antithing opened this issue 6 years ago

antithing commented 6 years ago

I have a trained model, and am using the following to return the likelihoods:


        // Read back the prediction results from the BAG classifier
        UINT predictedClassLabel = bag.getPredictedClassLabel();
        VectorFloat classLikelihoods = bag.getClassLikelihoods();
        VectorFloat classDistances = bag.getClassDistances();

        std::cout << " PredictedClassLabel: " << predictedClassLabel;

        std::cout << " ClassLikelihoods: ";
        for (UINT j = 0; j < classLikelihoods.size(); j++) {
            std::cout << classLikelihoods[j] << " ";
        }
        std::cout << " ClassDistances: ";
        for (UINT j = 0; j < classDistances.size(); j++) {
            std::cout << classDistances[j] << " ";
        }

This is working great, giving me, for example:

PredictedClassLabel: 6 ClassLikelihoods: 0 0 0 0 0 1 0 0 0 0 0  
ClassDistances: 0 0 0 0 0 0.390623 0 0 0 0 0

The classification is correct, but based on the data I have given it, I would expect the distances for some of the other classes to be non-zero as well. Is it possible to return the 'weights' or distances for every class, not just the predicted class?

Or should this be happening already?
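(For reference, here is how I am dumping the per-class values against their class labels, so it is clear which entry belongs to which class. Just a sketch; I am assuming getClassLabels() is available on the trained classifier via the GRT Classifier base class.)

        // Sketch: print each likelihood/distance next to its class label.
        // Assumes getClassLabels() is available on the trained classifier.
        Vector< UINT > classLabels = bag.getClassLabels();
        for (UINT j = 0; j < classLikelihoods.size(); j++) {
            std::cout << "class " << classLabels[j]
                      << " likelihood: " << classLikelihoods[j]
                      << " distance: " << classDistances[j] << std::endl;
        }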

Thanks!

antithing commented 6 years ago

... If I use a pipeline like this:

    //Load the training data from file
    ClassificationData trainingData;
    if (!trainingData.load("TrainingData.grt")) {
        std::cout << "ERROR: Failed to load the training data!\n";
        return EXIT_FAILURE;
    }

    //Print out some stats about the training data
    trainingData.printStats();

    //Create a new Gesture Recognition Pipeline using an Adaptive Naive Bayes Classifier
    GestureRecognitionPipeline pipeline;
    pipeline.setClassifier(ANBC());

    //Train the pipeline using the training data
    if (!pipeline.train(trainingData)) {
        std::cout << "ERROR: Failed to train the pipeline!\n";
        return EXIT_FAILURE;
    }

    //You can then get the accuracy of how well the pipeline performed during the k-fold cross validation testing
    double accuracy = pipeline.getCrossValidationAccuracy();

    ///////////////////////////////
    //Perform the prediction

    if (!pipeline.predict(inputVector)) {
        std::cout << "ERROR: Failed to perform the prediction!\n";
        return EXIT_FAILURE;
    }

    //Get the predicted class label
    UINT predictedClassLabel = pipeline.getPredictedClassLabel();
    VectorFloat classLikelihoods = pipeline.getClassLikelihoods();
    VectorFloat classDistances = pipeline.getClassDistances();

    std::cout << " PredictedClassLabel: " << predictedClassLabel;

    std::cout << " ClassLikelihoods: ";
    for (UINT j = 0; j<classLikelihoods.size(); j++) {
        std::cout << classLikelihoods[j] << " ";
    }
    std::cout << " ClassDistances: ";
    for (UINT j = 0; j<classDistances.size(); j++) {
        std::cout << classDistances[j] / 100 << " ";
    }

It gives me distances of:


 PredictedClassLabel: 4 ClassLikelihoods: 0 0 0 1 0 0 0 0 0 0 0 0
 ClassDistances: -38.3431 -52.1119 -59.9752 -6.21665 -42.6632 -48.6468 -inf -54.2848 -44.8232 -80.4233 -42.3163 -22.5498

where the smallest-magnitude (least negative) distance corresponds to the correct class. Can I use these numbers as 'weights' for each class somehow? Ideally I need a 0 to 1 value for each class.
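To be concrete, this is roughly the kind of 0 to 1 weight I am after: a sketch that treats the distances as unnormalized log scores and normalizes them with a softmax. The distancesToWeights helper is my own name, not a GRT function, and the include path may need adjusting for your install:

    // Sketch only (my own helper, not part of GRT): convert the class
    // distances (log-likelihood style scores, where -inf means "rejected")
    // into 0-1 weights with a softmax.
    #include <GRT/GRT.h>   // adjust the include path for your GRT install
    #include <algorithm>
    #include <cmath>
    #include <limits>
    using namespace GRT;

    VectorFloat distancesToWeights(const VectorFloat &classDistances) {
        VectorFloat weights;
        weights.resize(classDistances.size(), 0.0);
        if (classDistances.empty()) return weights;

        // Subtract the max for numerical stability; exp(-inf - max) becomes 0
        Float maxDist = *std::max_element(classDistances.begin(), classDistances.end());

        // If every class was rejected (all -inf) there is nothing to normalize
        if (maxDist == -std::numeric_limits<Float>::infinity()) return weights;

        Float sum = 0.0;
        for (size_t i = 0; i < classDistances.size(); i++) {
            weights[i] = std::exp(classDistances[i] - maxDist);
            sum += weights[i];
        }
        if (sum > 0) {
            for (size_t i = 0; i < classDistances.size(); i++) {
                weights[i] /= sum;
            }
        }
        return weights;
    }

With the distances in the example above, the gap between the best class and the rest is large, so the predicted class would still end up with a weight of essentially 1 and everything else near 0.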

thanks again.

antithing commented 6 years ago

... Ah, switching to SVM is working for me. Thanks!
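For anyone who finds this later, the only change was the classifier in the pipeline. A minimal sketch of the swap (a default-constructed SVM() is an assumption here, and the kernel/parameters may need tuning; trainingData and inputVector are unchanged from the code above):

    //Same pipeline as above, but with an SVM instead of ANBC
    GestureRecognitionPipeline pipeline;
    pipeline.setClassifier(SVM());

    if (!pipeline.train(trainingData)) {
        std::cout << "ERROR: Failed to train the pipeline!\n";
        return EXIT_FAILURE;
    }

    if (pipeline.predict(inputVector)) {
        UINT predictedClassLabel = pipeline.getPredictedClassLabel();
        VectorFloat classLikelihoods = pipeline.getClassLikelihoods();

        std::cout << " PredictedClassLabel: " << predictedClassLabel << " ClassLikelihoods: ";
        for (UINT j = 0; j < classLikelihoods.size(); j++) {
            std::cout << classLikelihoods[j] << " ";
        }
        std::cout << std::endl;
    }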