ctn-archive / nengo_theano

ABANDONED; see https://github.com/nengo/nengo instead
MIT License

Network arrays make the same population array_size times #13

Closed studywolf closed 11 years ago

studywolf commented 11 years ago

Instead of making array_size populations with different parameters, it uses the same neuron parameters for all array_size populations. I'm on it!

studywolf commented 11 years ago

OK, so everything breaks if we try to keep the neurons in an (array_size x neurons_num) array, so I'm going to rewrite things so it's all set up as a (total_neurons_num, 1) array and everything indexes into it accordingly. This will be way more efficient and understandable.
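For illustration, a minimal numpy sketch of the two layouts being discussed (the variable names follow the thread; this is not the repo's actual code):

```python
import numpy as np

array_size, neurons_num = 4, 50

# 3D-style layout: one block of neurons per sub-population
state_3d = np.zeros((array_size, neurons_num))

# flattened layout: all neurons stacked into one column
state_flat = state_3d.reshape(array_size * neurons_num, 1)

# recovering sub-population i from the flat layout is just slicing
i = 2
block = state_flat[i * neurons_num:(i + 1) * neurons_num, 0]
assert block.shape == (neurons_num,)
```

The flat layout trades per-population structure for simple slice-based indexing.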

hunse commented 11 years ago

I'm not sure if I understand why this is necessary, or why it will be more efficient and understandable. Have you started on this yet, @studywolf ?

studywolf commented 11 years ago

yeah I have. Basically all the calculations get complicated if you're trying to do them with a 3D matrix instead of a 2D matrix, and there are a bunch of places you have to watch out for it. But with a single array holding them all, you just have to make sure you split them up appropriately when calculating the decoders, rather than reshaping etc. at runtime with the 3D matrix.

hunse commented 11 years ago

Ok. I wasn't thinking of 3D arrays, but I guess with (array_size, neurons_num, dimensions) for encoders and the like, that's what you end up with. Carry on then! Let me know when you're finished, and I'll make sure that learning works with arrays.
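To make the bug in the issue title concrete, here is a hedged numpy sketch (not the repo's code; alpha and bias ranges are made up) of drawing a separate set of encoders, gains, and biases for each sub-ensemble in the (array_size, neurons_num, dimensions) layout, rather than reusing one set array_size times:

```python
import numpy as np

array_size, neurons_num, dimensions = 3, 100, 2
rng = np.random.RandomState(0)

# separate encoders for each sub-ensemble, normalized to unit length
encoders = rng.randn(array_size, neurons_num, dimensions)
encoders /= np.linalg.norm(encoders, axis=2, keepdims=True)

# separate gain (alpha) and bias per sub-ensemble (ranges are illustrative)
alpha = rng.uniform(50, 100, size=(array_size, neurons_num))
bias = rng.uniform(-5, 5, size=(array_size, neurons_num))

# input current for each sub-ensemble given its input x: J = alpha * (e . x) + bias
x = rng.randn(array_size, dimensions)
J = alpha * np.einsum('and,ad->an', encoders, x) + bias
assert J.shape == (array_size, neurons_num)
```

With this layout each of the array_size populations genuinely gets its own parameters, which is what the fix is after.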

studywolf commented 11 years ago

will do! I'm shooting to have it up this afternoon, and I'll let you know as soon as it is! The learning-with-arrays test is actually where I first realized the same bias, alpha, and encoders were being used for all ensembles in a network array. No longer!

studywolf commented 11 years ago

OK, I've been writing and debugging for a while now and it's still not quite working. I made a branch with the current version called 'n_array'; you should be able to retrieve it with (I think) git checkout --track origin/n_array. The test array_test runs and shows the problem. It's really strange: I believe that either J in ensemble.update isn't being calculated correctly, so the right input isn't getting to the neurons, or the decoders in ensemble_origin.compute_decoders aren't being calculated right. If you increase the number of neurons, the amplitude of the signals goes to 0. I am most perplexed. If anyone could take a look and see if they can spot where this is going wrong, that'd be awesome, @hunse, @tcstewar.

I went back to the 3D matrix, because the encoders become a huge sparse matrix in the 2D case. Hopefully it doesn't actually read that poorly.
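A small numpy sketch (purely illustrative, not the repo's code) of why the 2D layout blows up: the per-ensemble encoders become one big block-diagonal matrix that is mostly zeros.

```python
import numpy as np

array_size, neurons_num, dimensions = 3, 4, 2
rng = np.random.RandomState(0)
enc_3d = rng.randn(array_size, neurons_num, dimensions)

# equivalent 2D encoder matrix: block-diagonal, mostly zeros
total_neurons = array_size * neurons_num
enc_2d = np.zeros((total_neurons, array_size * dimensions))
for i in range(array_size):
    rows = slice(i * neurons_num, (i + 1) * neurons_num)
    cols = slice(i * dimensions, (i + 1) * dimensions)
    enc_2d[rows, cols] = enc_3d[i]

# the fraction of nonzero entries shrinks as 1 / array_size
density = np.count_nonzero(enc_2d) / enc_2d.size
assert abs(density - 1.0 / array_size) < 1e-12
```

So for large array_size almost all of the 2D matrix is wasted zeros, which is the motivation for keeping the 3D layout.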

And the other thing I found useful for tracking down problems is that theano objects can be inspected (for a sufficiently small number of neurons) with t_obj.eval(), so I've been checking shapes with t_obj.eval().shape.

studywolf commented 11 years ago

OK, it's up and going. Currently there's a problem when array_size and dimensions both equal 1; I believe somewhere something is being flattened improperly.

studywolf commented 11 years ago

OK, all the tests pass except decoded_weight_matrix_test; something is wrong in the way the network.connect function handles transforms for decoded_weight_matrix input (i.e. post.neurons x pre.dimensions) for post network arrays.

studywolf commented 11 years ago

Alright, everything but learning_test and decoded_weight_matrix_test passes; at this point I'm going to make issues for those and merge back into the main branch.