ybsong00 / Vital_release

VITAL: VIsual Tracking via Adversarial Learning
BSD 2-Clause "Simplified" License

Question about training the G network #7

Open Rheelt opened 6 years ago

Rheelt commented 6 years ago

The relevant code in `G_pretrain.m` is summarized below:

```matlab
prob_k = zeros(9,1);
for k = 1:9
    row = floor((k-1)/3)+1;
    col = mod((k-1),3)+1;
    for i = 1:nBatches
        batch = pos_data(:,:,:,opts.batchSize*(i-1)+1:min(end,opts.batchSize*i));
        batch(col,row,:,:) = 0;
        if(opts.useGpu)
            batch = gpuArray(batch);
        end
        res = vl_simplenn(net_fc, batch, [], [], ...
            'disableDropout', true, ...
            'conserveMemory', true, ...
            'sync', true);
        f = gather(res(end).x);
        if ~exist('feat','var')
            feat = zeros(size(f,1),size(f,2),size(f,3),n,'single');
        end
        feat(:,:,:,opts.batchSize*(i-1)+1:min(end,opts.batchSize*i)) = f;
    end
    X = feat;
    E = exp(bsxfun(@minus, X, max(X,[],3)));
    L = sum(E,3);
    Y = bsxfun(@rdivide, E, L);
    prob_k(k) = sum(Y(1,1,1,:));
end
[~,idx] = min(prob_k);
```
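The loop above enumerates nine dropout masks on a 3×3 grid, zeroes one spatial cell per mask, and accumulates the softmax probability of the target class over the batch. A minimal NumPy sketch of the same computation (with a hypothetical `select_mask` helper and made-up feature shapes; the real code runs the masked features through `net_fc` first, which is omitted here):

```python
import numpy as np

def select_mask(pos_feat):
    """Pick the dropout mask whose occlusion yields the lowest
    target-class probability, mirroring the G_pretrain.m loop.

    pos_feat: array of shape (3, 3, C, N) -- positive-sample feature
              maps (a stand-in for the conv features fed to net_fc).
    """
    prob_k = np.zeros(9)
    for k in range(9):
        row, col = k // 3, k % 3
        batch = pos_feat.copy()
        batch[col, row, :, :] = 0.0              # zero one spatial cell
        # Numerically stable softmax over the channel axis, as in the
        # MATLAB code (max-subtraction, exp, normalize).
        E = np.exp(batch - batch.max(axis=2, keepdims=True))
        Y = E / E.sum(axis=2, keepdims=True)
        prob_k[k] = Y[0, 0, 0, :].sum()          # class-1 prob over batch
    return int(np.argmin(prob_k)), prob_k
```

For example, if only cell (0, 0) carries the discriminative response, the mask that zeroes that cell produces the lowest `prob_k` and is the one selected.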

The mask corresponding to the `idx` selected by `[~,idx]=min(prob_k)` produces the smallest D-network loss, not the largest. This selection differs from the method described in the paper. @22wei22

Rheelt commented 6 years ago

@yl1991

ybsong00 commented 5 years ago

We apply a softmax after the network output, so `prob_k` denotes the output probability rather than the loss. Since the loss decreases as the probability increases, taking the minimum probability selects the mask with the highest loss.
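The reply above can be checked numerically: the cross-entropy loss of the true class is `-log(p)`, which is strictly decreasing in `p`, so minimizing the softmax probability picks the same mask as maximizing the loss. A tiny NumPy sketch with made-up `prob_k` values:

```python
import numpy as np

# Cross-entropy loss of the true class is -log(p). Because -log is
# strictly decreasing, the mask with the minimum softmax probability
# is the mask with the maximum D-network loss, so min(prob_k) matches
# the paper's "highest loss" criterion.
probs = np.array([0.9, 0.6, 0.2, 0.75])   # hypothetical prob_k values
losses = -np.log(probs)
print(np.argmin(probs), np.argmax(losses))  # both pick index 2
```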

vincennnnt commented 5 years ago

@Rheelt Hi, where in the code is the cost-sensitive loss invoked? Isn't the mask weight supposed to be updated according to that loss? I could not find where it is called in the code.