pluskid / Mocha.jl

Deep Learning framework for Julia

Batchsize bug #159

Closed · lqh20 closed 8 years ago

lqh20 commented 8 years ago

The batchsize specified in the data layer seems to be half the batchsize actually used.
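
For context, the batch size in question is the batch_size argument of the data layer. A minimal sketch of such a layer (the source file and numbers here are hypothetical):

    data_layer = HDF5DataLayer(name="train-data", source="train.txt", batch_size=100)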

We used the following snippet of code to count the number of minibatches iterated over in an epoch.

function forward_epoch_with_loss(net::Net)
    ll = 0.0
    epoch = get_epoch(net)
    count_batches = 0
    while get_epoch(net) == epoch
        count_batches += 1
        result_forward = forward(net, 0.0)
        ll -= forward(net)

        # DEBUG
        #println("LL from forward: $(result_forward)")
    end

    # DEBUG
    println("Did $(count_batches) batches")

    return (count_batches, ll)
end

The batch count is off by a factor of 2 compared to the expected (data set size)/(batch size): it seems that either only half the data is being used, or the actual batch size is twice what was specified.
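
To make the mismatch concrete, a worked example of the arithmetic (all numbers invented for illustration):

    n_samples  = 10_000                      # hypothetical data set size
    batch_size = 100                         # as specified in the data layer
    expected   = div(n_samples, batch_size)  # 100 batches per epoch
    # the counting loop above would report roughly div(expected, 2), i.e. ~50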

pluskid commented 8 years ago

Because you forward-ed twice during each iteration, the total number of iterations is halved: each call to forward consumes one minibatch from the data layer.
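
A minimal sketch of the corrected loop, reusing the single return value of forward (assuming, as in the snippet above, that forward(net, 0.0) returns the loss for the current minibatch):

    function forward_epoch_with_loss(net::Net)
        ll = 0.0
        epoch = get_epoch(net)
        count_batches = 0
        while get_epoch(net) == epoch
            count_batches += 1
            # forward once per iteration; each call consumes one minibatch
            loss = forward(net, 0.0)
            ll -= loss
        end
        println("Did $(count_batches) batches")
        return (count_batches, ll)
    end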