happynear / FaceVerification

An Experimental Implementation of Face Verification, 96.8% on LFW.

LFW with JB #15

Open m10303430 opened 8 years ago

m10303430 commented 8 years ago

Hi,

I want to test LFW with Joint Bayesian (JB). I am referring to your code (https://github.com/happynear/FaceVerification/blob/master/test_lfw.m), but part of the 10-fold cross-validation looks odd to me:

```matlab
testing = 10;
tmp = pairlist_lfw.IntraPersonPair;
tmp((testing-1)*300+1:testing*300,:) = [];
train_Intra = tmp;
idx = [idx; tmp(:)];
tmp = pairlist_lfw.ExtraPersonPair;
tmp((testing-1)*300+1:testing*300,:) = [];
train_Extra = tmp;

testing = 8;
test_Intra = pairlist_lfw.IntraPersonPair((testing-1)*300+1:testing*300,:);
test_Extra = pairlist_lfw.ExtraPersonPair((testing-1)*300+1:testing*300,:);
```

The train and test data overlap here. Is that correct? Or should I refer to your other files (lfwJB.m, test_lfw_validate.m)? Thanks!

happynear commented 8 years ago

This line

```matlab
tmp((testing-1)*300+1:testing*300,:) = [];
```

deletes the testing fold's features from the training set.
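The split under discussion can be sketched in Python with NumPy (an illustrative stand-in for the MATLAB code, not the repo's implementation; `fold_split` and the dummy `pairs` array are assumptions standing in for the 3000 LFW pairs):

```python
import numpy as np

def fold_split(pairs, testing, fold_size=300):
    """Mimic the MATLAB idiom: rows of fold `testing` become the test set;
    tmp(range,:) = [] deletes those rows, leaving the training set."""
    lo, hi = (testing - 1) * fold_size, testing * fold_size
    test = pairs[lo:hi]
    train = np.delete(pairs, np.s_[lo:hi], axis=0)  # like tmp(range,:) = []
    return train, test

pairs = np.arange(3000).reshape(3000, 1)  # stand-in for the 3000 LFW pairs

# Same `testing` for both splits: train and test are disjoint.
train10, test10 = fold_split(pairs, testing=10)
print(len(train10), len(test10), len(np.intersect1d(train10, test10)))  # 2700 300 0

# Mismatched values, as in the snippet (train uses 10, test uses 8): overlap.
train, _ = fold_split(pairs, testing=10)
_, test = fold_split(pairs, testing=8)
print(len(np.intersect1d(train, test)))  # 300
```

With matching `testing` values the two sets are disjoint; with `testing = 10` for training and `testing = 8` for testing, all 300 test pairs also appear in the training set, which is the overlap the question points at.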

m10303430 commented 8 years ago

Sorry, may I write in Chinese?

But doesn't 10-fold cross-validation mean splitting the samples into 10 parts, training on 9 and testing on 1? If that line is deleted, wouldn't all 10 parts be used for training?

happynear commented 8 years ago

Yes, that is exactly what 10-fold means. If you delete

```matlab
tmp((testing-1)*300+1:testing*300,:) = [];
```

then all of the data would be used for training.

m10303430 commented 8 years ago

But training on everything would no longer be 10-fold cross-validation, would it? Shouldn't 9 folds be used for training? (If that line is deleted, doesn't that mean all 10 folds go into training??)

Or is my understanding of 10-fold cross-validation wrong?

happynear commented 8 years ago

My code is standard 10-fold cross-validation. Try `A(range) = [];` yourself in MATLAB and you will see what it does.
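For anyone without MATLAB at hand, the effect of `A(range) = [];` can be sketched with NumPy's `np.delete` (an illustrative equivalent, not the repo's code):

```python
import numpy as np

A = np.arange(1, 11)           # MATLAB: A = 1:10;
A = np.delete(A, np.s_[3:6])   # MATLAB: A(4:6) = [];  (MATLAB is 1-based)
print(A)                       # [ 1  2  3  7  8  9 10]
```

The indexed elements are removed and the array shrinks, so what remains is exactly the complement of the deleted range, i.e. the training folds.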

m10303430 commented 8 years ago

So, as you said, I should delete `tmp((testing-1)*300+1:testing*300,:) = [];`, is that right?

happynear commented 8 years ago

You should keep `tmp((testing-1)*300+1:testing*300,:) = [];`. This line means: delete the features in the range `(testing-1)*300+1 : testing*300` from `tmp`.

m10303430 commented 8 years ago

Yes, keep it. But then, when testing below, shouldn't the features in that same range `(testing-1)*300+1 : testing*300` be the ones tested? So the `testing` variable must be the same in both places.

```matlab
testing = 10;
tmp = pairlist_lfw.IntraPersonPair;
tmp((testing-1)*300+1:testing*300,:) = [];
train_Intra = tmp;
idx = [idx; tmp(:)];
tmp = pairlist_lfw.ExtraPersonPair;
tmp((testing-1)*300+1:testing*300,:) = [];
train_Extra = tmp;
```

Up to here, doesn't this delete features 2701~3000?

Then at test time, shouldn't features 2701~3000 be the ones tested? So the `testing` below should be 10, not 8?

```matlab
testing = 8;
test_Intra = pairlist_lfw.IntraPersonPair((testing-1)*300+1:testing*300,:);
test_Extra = pairlist_lfw.ExtraPersonPair((testing-1)*300+1:testing*300,:);
```

happynear commented 8 years ago

Oh, right. Delete the `testing = 8;` line so the two parts stay consistent.

happynear commented 8 years ago

And you need to do it 10 times and take the average. Just wrap it in a for loop.
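The suggested loop might look like the following Python sketch; `evaluate_fold` is a hypothetical placeholder for training Joint Bayesian on the 9 remaining folds and scoring the held-out one (here it only performs the split and returns a dummy score):

```python
import numpy as np

def evaluate_fold(pairs, testing, fold_size=300):
    """Hypothetical placeholder: split exactly as in test_lfw.m, then
    train JB on `train` and return accuracy on `test`. Here only the
    split is performed; the return value is a dummy stand-in."""
    lo, hi = (testing - 1) * fold_size, testing * fold_size
    test = pairs[lo:hi]
    train = np.delete(pairs, np.s_[lo:hi], axis=0)
    assert len(train) + len(test) == len(pairs)  # folds partition the data
    return len(test) / len(pairs)  # dummy score instead of real accuracy

pairs = np.arange(3000).reshape(3000, 1)
# Each of the 10 folds serves as the test set exactly once; average the scores.
accuracies = [evaluate_fold(pairs, testing) for testing in range(1, 11)]
mean_accuracy = float(np.mean(accuracies))
```

The point is that `testing` takes each value 1 through 10 in turn, so every fold is held out exactly once, and the reported number is the mean over the 10 runs.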

m10303430 commented 8 years ago

Yes, that is what I did, making the two consistent, but the performance is very poor. May I ask why that would happen?

m10303430 commented 8 years ago

Sorry to bother you again: when using the CASIA data, how is the data for the siamese network created?

happynear commented 8 years ago

Just use the scripts in the dataset folder to generate the lists. As for performance, that is not something you get right in a day.

ghost commented 8 years ago

Regarding "do it 10 times and average, just use a for loop": my understanding is that you need 10 runs, each time randomly choosing one of the 10 parts as the test set and the remaining 9 parts as the training set, then averaging over those 10 runs. It is not simply repeating the same run 10 times, right?

happynear commented 8 years ago

@pbypby Just change the `testing` variable each time.

getengqing commented 8 years ago

Where do these .mat datasets come from? Are they trained or downloaded?

ghost commented 8 years ago

@getengqing They can be downloaded from the JB authors' paper homepage.

m10303430 commented 8 years ago

I ran the JB+SVM you provided on the lbp_lfw.mat features you provided, and the result was only 54%. Did I do something wrong?

getengqing commented 8 years ago

Could you give a link to the JB authors' paper homepage? Thanks.

tiankong1993 commented 7 years ago

Have you found the link? @getengqing

tiankong1993 commented 7 years ago

Where should the .mat files be downloaded from? @m10303430

tiankong1993 commented 7 years ago

Could you share a download link for lbp_lfw.mat? I searched for a long time without finding it. Sorry to bother you. @pby5