RockStarCoders / alienMarkovNetworks

Using MRFs and CRFs for computer vision problems.

Select parameters for labelling by optimising validation set error #10

Closed jsherrah closed 10 years ago

jsherrah commented 10 years ago

For example, the superpixel parameters (2 of them), and K in the MRF.

jsherrah commented 10 years ago

Need #22 first

jsherrah commented 10 years ago

At this point we have worked out the optimal classifier. It's now a matter of running the MRF on the validation-set images with different edge-weight parameters, K1 and K2 (a sketch of this loop follows the list below):

  1. For each combination of values (K1, K2):
    1. Generate MRF(K1, K2) labellings for the validation images using labelAllImages.sh (to be written).
    2. Use evalPredictions.py to compute validation-set accuracy for (K1, K2).
    3. Select the optimal (K1, K2) by validation-set accuracy.
  2. Estimate the test-set error for these parameters.
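
For concreteness, here is a minimal Python sketch of that grid-search loop. The callables `label_images` and `eval_accuracy` are hypothetical stand-ins for labelAllImages.sh (to be written) and evalPredictions.py; only the loop structure comes from the plan above.

```python
import itertools

def select_edge_weights(k1_values, k2_values, label_images, eval_accuracy):
    """Grid search over the MRF edge-weight parameters (K1, K2) on the validation set.

    label_images(k1, k2) should produce labellings of the validation images
    (hypothetical stand-in for labelAllImages.sh); eval_accuracy(labellings)
    should return the average validation accuracy (hypothetical stand-in for
    evalPredictions.py).
    """
    results = {}
    for k1, k2 in itertools.product(k1_values, k2_values):
        labellings = label_images(k1, k2)              # step 1.1: label validation images
        results[(k1, k2)] = eval_accuracy(labellings)  # step 1.2: score the labellings
    best_k1, best_k2 = max(results, key=results.get)   # step 1.3: pick the best pair
    return (best_k1, best_k2), results
```

The test-set error (step 2) would then be estimated once, with the selected (K1, K2) held fixed.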
amb-enthusiast commented 10 years ago

Hi Jamie, I assumed that since you are already crunching these numbers you'd like the pleasure of closing this issue!

jsherrah commented 10 years ago

Yep, I'm trying to resurrect my nohup'd pdb process at the moment.

jsherrah commented 10 years ago

Ok I have a first result for this on the validation set:

```
$ ./evaluateMSRCLabelling.sh /vagrant/results/imagesClassified/validation /vagrant/results/imagesLabelled/validation /vagrant/msrcData/validation /vagrant/features/msrcTraining_slic-400-010.00_adj.pkl
Evaluting MRF for K = 0.01
68.4134262012
K = 0.01, average accuracy = 68.4134262012
Evaluting MRF for K = 0.05
69.3001067234
K = 0.05, average accuracy = 69.3001067234
Evaluting MRF for K = 0.1
69.7351743635
K = 0.1, average accuracy = 69.7351743635
Evaluting MRF for K = 0.5
69.8613318329
K = 0.5, average accuracy = 69.8613318329
Evaluting MRF for K = 1.0
66.4329870103
K = 1.0, average accuracy = 66.4329870103
Evaluting MRF for K = 1.5
63.3614810355
K = 1.5, average accuracy = 63.3614810355
Evaluting MRF for K = 2.0
65.5317309179
K = 2.0, average accuracy = 65.5317309179
```

That's for the degree and adjacency criterion. K seems fairly robust between 0.1 and 0.5; I could do a higher-resolution search around there.
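
As a small illustration (not from the repo), the finer sweep could bracket that region like this, assuming numpy; the accuracies are rounded from the output above.

```python
import numpy as np

# Coarse sweep results from the run above: K -> average accuracy (%).
coarse = {
    0.01: 68.41, 0.05: 69.30, 0.1: 69.74, 0.5: 69.86,
    1.0: 66.43, 1.5: 63.36, 2.0: 65.53,
}

# Accuracies are close between K = 0.1 and K = 0.5, so sweep that region
# again at higher resolution.
fine_k_values = np.linspace(0.1, 0.5, 9)   # 0.10, 0.15, ..., 0.50
```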

jsherrah commented 10 years ago

```
./evaluateMSRCLabelling.sh /vagrant/results/imagesClassified/validation /vagrant/results/imagesLabelled/validation2 /vagrant/msrcData/validation /vagrant/features/msrcTraining_slic-400-010.00_adj.pkl
```

jsherrah commented 10 years ago

```
vagrant@vagrant-ubuntu-raring-64:/vagrant/alienMarkovNetworks$ ./evaluateMSRCLabelling.sh /vagrant/results/imagesClassified/validation /vagrant/results/imagesLabelled/validation2 /vagrant/msrcData/validation /vagrant/features/msrcTraining_slic-400-010.00_adj.pkl
Evaluting MRF for K = 0.1
69.7351743635
K = 0.1, average accuracy = 69.7351743635
Evaluting MRF for K = 0.2
70.3278485638
K = 0.2, average accuracy = 70.3278485638
Evaluting MRF for K = 0.3
70.1312096771
K = 0.3, average accuracy = 70.1312096771
Evaluting MRF for K = 0.4
70.1121038621
K = 0.4, average accuracy = 70.1121038621
Evaluting MRF for K = 0.5
69.8613318329
K = 0.5, average accuracy = 69.8613318329
```

jsherrah commented 10 years ago

K=0.2 is the winner.
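
Reading the winner off the finer sweep above could be scripted like this; the accuracies are rounded from the printed output and the snippet assumes nothing from the repo.

```python
# Average validation accuracies (%) from the finer sweep above.
accuracy_by_k = {0.1: 69.74, 0.2: 70.33, 0.3: 70.13, 0.4: 70.11, 0.5: 69.86}

# Select the K that maximises validation accuracy.
best_k = max(accuracy_by_k, key=accuracy_by_k.get)
print(best_k)  # 0.2
```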