GeorgiadouAntigoni / weak_lensing_machine_learning


Test impact of neuron numbers in Architecture 0 #2

Open bnord opened 5 years ago

bnord commented 5 years ago

GeorgiadouAntigoni commented 5 years ago

[Plot: sigma vs. the constant multiplying the number of neurons, 3 iterations without changing the seed]

bnord commented 5 years ago

But we didn't plot all three iterations at each constant value, right?

GeorgiadouAntigoni commented 5 years ago

Looped the AE with Architecture 0 over 5 alp values (the constant multiplying the # of neurons) and, for each alp, did 10 iterations with 10 different seeds. This is a plot of sigma vs. alp, with error bars on sigma from the spread of results over the iterations. Settings: alp = 5, # of iterations = 10, epochs = 500.

[Figure: sigma_vs_alp_w_errors_seed10_alp5_ep500-2]
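For reference, a minimal sketch of that kind of sweep, assuming a generic dense autoencoder on placeholder data stands in for Architecture 0 (the base layer widths, the training data, and the sigma statistic below are assumptions, not the repo's actual setup):

```python
# Sketch: sweep alp (neuron-count multiplier) and seeds, collect sigma per run.
# build_arch0 is an assumed stand-in for the repo's Architecture 0.
import numpy as np
import tensorflow as tf

def build_arch0(alp, n_features, base_widths=(64, 32, 16)):
    """Dense autoencoder whose hidden-layer widths are scaled by `alp` (assumed widths)."""
    widths = [max(1, int(alp * w)) for w in base_widths]
    inp = tf.keras.Input(shape=(n_features,))
    x = inp
    for w in widths:                          # encoder
        x = tf.keras.layers.Dense(w, activation="relu")(x)
    for w in reversed(widths[:-1]):           # decoder mirrors the encoder
        x = tf.keras.layers.Dense(w, activation="relu")(x)
    out = tf.keras.layers.Dense(n_features)(x)
    model = tf.keras.Model(inp, out)
    model.compile(optimizer="adam", loss="mse")
    return model

n_features = 100
x_train = np.random.normal(size=(1000, n_features)).astype("float32")  # placeholder data

alps = [1, 2, 3, 4, 5]      # assumed alp values (constants multiplying the # of neurons)
n_seeds = 10                # iterations per alp, one seed each
epochs = 500

results = {}                # alp -> array of sigma values over the seeds
for alp in alps:
    sigmas = []
    for seed in range(n_seeds):
        np.random.seed(seed)                  # change the seed each iteration
        tf.random.set_seed(seed)
        model = build_arch0(alp, n_features)
        model.fit(x_train, x_train, epochs=epochs, batch_size=64, verbose=0)
        resid = x_train - model.predict(x_train, verbose=0)
        sigmas.append(resid.std())            # stand-in "sigma" statistic
    results[alp] = np.asarray(sigmas)
```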

GeorgiadouAntigoni commented 5 years ago

Will loop over 100 iterations instead of 10 and get back with the results.

GeorgiadouAntigoni commented 5 years ago

Plot showing sigma versus the constant multiplying the # of neurons in each layer. I did 50 iterations (with 50 different seeds), for alp = 5 and epochs = 500.

[Figure: sigma_vs_alp_w_errors_seed50_alp5_ep500]
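One possible way to make this error-bar plot, assuming a `results` dict like the one in the earlier sketch (alp -> array of sigma values over seeds); the placeholder dict here is only to make the snippet runnable on its own:

```python
# Sketch: sigma vs. alp with error bars from the spread over seeds.
import numpy as np
import matplotlib.pyplot as plt

# Placeholder results; in practice use the dict collected from the sweep.
results = {a: np.random.normal(1.0, 0.05, size=50) for a in [1, 2, 3, 4, 5]}

alps = sorted(results)
means = [results[a].mean() for a in alps]
errs = [results[a].std() for a in alps]     # error bars from the 50-seed spread

plt.errorbar(alps, means, yerr=errs, fmt="o-", capsize=3)
plt.xlabel("alp (constant multiplying the # of neurons)")
plt.ylabel("sigma")
plt.title("sigma vs alp, 50 seeds, epochs = 500")
plt.savefig("sigma_vs_alp_w_errors.png")
```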

bnord commented 5 years ago

So, to me it looks like this flattens out with increasing neurons. I could imagine that it would get better, but not by much.

What do you think?

GeorgiadouAntigoni commented 5 years ago

I think we can benefit from increasing the number of neurons, but I don't see much difference between the x2 and x5 implementations. If we just double the size, we can probably gain in precision without losing much in speed.

bnord commented 5 years ago

I agree with that.

Do you mean double from x5 or double from x2?

bnord commented 5 years ago

Also, are you recording the run times for each of the trainings?

GeorgiadouAntigoni commented 5 years ago

I mean to double the # of neurons in Arch0, which means to go with the x2 implementation.

GeorgiadouAntigoni commented 5 years ago

I haven't been recording the run times... I'll start doing that from now on.
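A simple way to start recording run times per training, assuming a Keras-style model with a `fit` method; the CSV file name and columns are just a suggestion, not part of the repo:

```python
# Sketch: time each training and append the result to a CSV log.
import csv
import time

def timed_fit(model, x, alp, seed, epochs=500, log_path="runtimes.csv"):
    """Fit the model, record wall-clock training time, and log it with the run settings."""
    start = time.perf_counter()
    model.fit(x, x, epochs=epochs, batch_size=64, verbose=0)
    elapsed = time.perf_counter() - start
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([alp, seed, epochs, f"{elapsed:.2f}"])
    return elapsed
```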