ReProduceByYQ / Channel_pruning_yq

This repo is a re-production of Channel_pruning.
MIT License

about Lasso regression #4

Closed · young-fire closed this issue 6 years ago

young-fire commented 6 years ago

When I test it, after the Lasso regression I do this:

count = sum(idxs)
idxs[0:count] = True
idxs[count:] = False

The number of kept filters is the same, but it is the first count filters that are preserved. The top-5 and top-1 accuracies do not drop. Why?
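For readers following the thread, here is a minimal sketch of what those three lines do, assuming idxs is a boolean NumPy array produced by the Lasso channel selection (the mask contents below are made up for illustration): the selected, generally non-contiguous channels are replaced by simply the first count channels.

```python
import numpy as np

# Hypothetical Lasso channel mask: True marks channels kept by the selection.
idxs = np.array([True, False, True, True, False, False, True, False])

count = int(np.sum(idxs))   # number of channels the Lasso kept (here 4)
idxs[0:count] = True        # overwrite: keep the first `count` channels instead
idxs[count:] = False        # drop everything else

print(count, idxs)          # 4 [ True  True  True  True False False False False]
```

The channel count is preserved, but which channels survive is changed, which is exactly what the discussion below is about.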

young-fire commented 6 years ago

Please test it; maybe something is wrong in my code.

Johnson-yue commented 6 years ago

Sorry, I cannot understand what you are trying to do. After the Lasso regression, the channel mask is in idxs, and count = sum(idxs) should be equal to rank. The selected channels are not contiguous, so why do you do

idxs[0:count] = True
idxs[count:] = False

These two lines modify the result of the Lasso selection. That is a mistake! Can you tell me what you want to do?
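To illustrate why the Lasso mask is generally non-contiguous, here is a toy sketch, not the repo's code: X, Y, alpha, and the rank below are all invented. It treats each column of X as the contribution of one input channel to the layer output and uses scikit-learn's Lasso; the non-zero coefficients define the kept channels, and their count plays the role of rank.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Toy Lasso channel selection (illustrative assumptions, not the repo's setup).
rng = np.random.default_rng(0)
n_samples, n_channels, rank = 500, 16, 6

X = rng.standard_normal((n_samples, n_channels))
support = rng.choice(n_channels, size=rank, replace=False)   # "true" relevant channels
Y = X[:, support] @ rng.standard_normal(rank)                # output built from a channel subset

lasso = Lasso(alpha=0.05, fit_intercept=False)
lasso.fit(X, Y)

idxs = lasso.coef_ != 0     # boolean channel mask, generally NOT contiguous
count = int(np.sum(idxs))   # with a suitable alpha this matches the target rank
print(count, np.flatnonzero(idxs))
```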

young-fire commented 6 years ago

Yes, these two lines modify the result of the Lasso selection. Using this result to reshape W, the top-5 accuracy still does not decline.
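For completeness, a minimal sketch of what I understand by "use this result to reshape the W": after fixing a channel mask, re-fit the remaining weights by least squares against the original output. This is an assumption on my part (X, Y, and the mask below are illustrative), not necessarily what the repo does.

```python
import numpy as np

# Least-squares reconstruction of the weights on the kept channels (sketch).
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 16))   # per-channel contributions (illustrative)
Y = X @ rng.standard_normal(16)      # original layer output to reproduce

idxs = np.zeros(16, dtype=bool)
idxs[:8] = True                      # e.g. a "first k" mask as discussed above

W_new, *_ = np.linalg.lstsq(X[:, idxs], Y, rcond=None)
err = np.linalg.norm(X[:, idxs] @ W_new - Y) / np.linalg.norm(Y)
print(W_new.shape, err)              # relative reconstruction error of the pruned layer
```

Because the weights are re-fitted after pruning, a single layer can often absorb a sub-optimal channel choice, which may explain why the accuracy barely moves.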

young-fire commented 6 years ago

I mean that using this "mistaken" result gives the same accuracy.

young-fire commented 6 years ago

Oh, I see. I am using the method called "first k"; for a single layer it causes almost no loss.