Open xlong0513 opened 4 years ago
Hello!
I'm glad you're taking a look at this -- it's nice to dust off this old code!
As you have noticed, this is actually done in Java, not Python. It uses the old Nengo 1.4, which was a Java program that allowed you to define models using Python syntax, as it was built on Jython (a system that compiles Python to the Java Virtual Machine). So this does not use a normal Python install at all. Instead, you need Nengo 1.4 (https://github.com/nengo/nengo-1.4) and you'll run the model using that. That should have `timeview` and all the `ca.nengo` stuff.
As for why we used Java back then, it was mostly because the first grad student (Bryan Tripp) who was working with Chris Eliasmith was very familiar with Java. Up until then, this sort of modelling was all being done in Matlab, and we wanted to get away from that, and so Bryan built a very nice Java version. That worked very well, and was what we used to build Spaun, and turned out to be surprisingly fast, as the core inner simulation loop was well suited to optimization by the Java Just-In-Time compiler. But it soon became clear that Java didn't make sense as we were getting into running things on GPUs and more exotic neuromorphic hardware, so that's why we did the big rewrite to the modern Python version of Nengo.
Hmm, I'm looking at this in more detail now, and it looks like this is using a slightly modified version of the `funcrep` system for visual display. I can't seem to find that modified version anywhere on my current computer, but I might find it in some other archives... However, it is just about the visualization, so we should be able to deal with that if we can get the model itself running.
Let me know if there are any other problems getting this up and running, and I'll look into this `funcrep` issue.
Hi @tcstewar, thanks for the reply. Actually, I did notice `nengo-1.4` and tried to use it. However, the problem for me is a little complicated. First, the installation of `nengo-1.4` is too complex; there is no convenient installation method like `pip` for nengo. Second, it seems `timeview` only supports Python 3.6 or later, while my environment is the default Python 3.5.2 on Ubuntu 16, so `timeview` is unavailable. I have tried to find an alternative to the `math.impl` module in nengo, but failed.
For me, I just want the Path Integration code and its visualization results in nengo. Is there a ready-made example in nengo?
It is definitely much easier to install things with `pip` -- that's why we switched over to that method! :) Installing `nengo-1.4` should just be downloading from the repository, but then you also have to have Java installed, which isn't as common any more, so I could see that being a problem. That will have `timeview` built in, though, so if `nengo-1.4` runs, then you should be all set.
However, since the model is so small, I thought I'd also take a stab at just rewriting the model for modern nengo versions. I think I have it working! Here's the model:
```python
import numpy as np
import nengo

directions = np.loadtxt('../directions.csv', delimiter=',')
transMatX = np.loadtxt('../transMatX.csv', delimiter=',')
transMatY = np.loadtxt('../transMatY.csv', delimiter=',')
freqX = np.loadtxt('../freqX.csv', delimiter=',')
freqY = np.loadtxt('../freqY.csv', delimiter=',')
samples = np.loadtxt('../samples.csv', delimiter=',')

tau = 0.01
deltaT = 0.0001
population_size = 1600
glength = len(transMatX) + 1  # number of dimensions that represent the gaussian
Imat = np.eye(glength - 1)
initial_input = np.random.uniform(-1, 1, glength)
zero = np.zeros(glength)

model = nengo.Network()
with model:
    # brief initial kick to form the bump, then zero input
    place_input = nengo.Node(lambda t: initial_input if t < 0.01 else zero)
    control = nengo.Node([0, 0])
    path_int = nengo.Ensemble(n_neurons=population_size, dimensions=glength + 2,
                              radius=1, encoders=directions, eval_points=samples)
    nengo.Connection(place_input, path_int, synapse=tau,
                     transform=np.concatenate([np.eye(glength), np.zeros([2, glength])], 0) * tau * 20)
    nengo.Connection(control, path_int, synapse=tau,
                     transform=np.concatenate([np.zeros([glength, 2]), np.eye(2)], 0))

    # rotation in the fourier space is a shift in the original 2D space
    def rotate(x):
        x = np.transpose(np.array([x]))  # first convert to a 2D matrix
        Anef = (tau / deltaT) * ((transMatX - Imat) * x[glength]
                                 + (transMatY - Imat) * x[glength + 1]) + Imat
        x[1:glength] = np.dot(Anef, x[1:glength])
        return np.transpose(x[0:glength])[0]

    # feedback transformation
    trans = np.concatenate([np.eye(glength), np.zeros([2, glength], 'f')], 0)
    nengo.Connection(path_int, path_int, synapse=tau, function=rotate, transform=trans)

# now let's create the visualization system
params = np.concatenate([freqX[:, None], freqY[:, None]], 1)

def basisFunc(x, freqs):
    const = np.ones([1, len(x[0])])
    basis = np.zeros([glength - 1, len(x[0])], 'f')  # the first (constant) basis is not included
    empty = np.zeros([2, len(x[0])], 'f')
    X = np.array(x[0])
    Y = np.array(x[1])
    for i in range(glength - 1):
        iFreq = i // 2 + 1  # the first index points to the constant term
        if i % 2 == 0:
            basis[i] = np.cos(X * 2 * np.pi * freqs[iFreq, 0] + Y * 2 * np.pi * freqs[iFreq, 1])
        else:
            basis[i] = np.sin(X * 2 * np.pi * freqs[iFreq, 0] + Y * 2 * np.pi * freqs[iFreq, 1])
    return np.concatenate([const, basis, empty], 0)

n_pts = 50
XX, YY = np.meshgrid(np.linspace(-1, 1, n_pts), np.linspace(-1, 1, n_pts))
basis = basisFunc([XX.flat, YY.flat], params).T

def plot_func(t, x):
    scale = 0.2
    values = np.dot(basis, x) * scale
    values *= 255.0  # put on a scale of 0 - 255
    values = values.reshape((n_pts, n_pts))
    values = np.flipud(values)  # flip so the image isn't upside down
    # generate a heat-map image from the values
    import base64
    from PIL import Image
    try:
        from cStringIO import StringIO
    except ImportError:
        from io import BytesIO as StringIO
    img = Image.fromarray(values).convert('RGB')
    buffer = StringIO()
    img.save(buffer, format="JPEG")
    img_str = base64.b64encode(buffer.getvalue())
    plot_func._nengo_html_ = '''
        <svg width="100%%" height="100%%" viewBox="0 0 100 100">
            <image width="100%%" height="100%%" xlink:href="data:image/jpeg;base64,%s">
        </svg>''' % img_str.decode('utf-8')

plot_func._nengo_html_ = ''

with model:
    plot = nengo.Node(plot_func, size_in=path_int.dimensions)
    nengo.Connection(path_int, plot)
```
That should run right inside the modern nengo, including having the graphical visualization (the second half of the code is all just creating the visualizer).
It's still relying on the data generated by that matlab code, though, so we should also try porting that matlab code over to python, and then everything would be much cleaner. But let me know if that works for you in the meantime!
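The `rotate` feedback above leans on the Fourier shift theorem: rotating each (cos, sin) coefficient pair by an angle proportional to the displacement shifts the reconstructed bump in space. A minimal standalone NumPy sketch of that idea (the frequency and coefficient values are illustrative, not taken from the model's data files):

```python
import numpy as np

# A 1-D function encoded by one (cos, sin) coefficient pair (a, b):
#     g(x) = a*cos(2*pi*f*x) + b*sin(2*pi*f*x)
# Rotating (a, b) by theta = 2*pi*f*dx reproduces g(x - dx).
f = 3.0           # spatial frequency (illustrative)
a, b = 0.8, 0.3   # Fourier coefficients (illustrative)
dx = 0.1          # desired spatial shift

theta = 2 * np.pi * f * dx
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
a2, b2 = rot @ np.array([a, b])

x = np.linspace(0, 1, 1000)
shifted = a * np.cos(2 * np.pi * f * (x - dx)) + b * np.sin(2 * np.pi * f * (x - dx))
rotated = a2 * np.cos(2 * np.pi * f * x) + b2 * np.sin(2 * np.pi * f * x)
print(np.allclose(shifted, rotated))  # → True: the two curves coincide
```

Applying one such rotation per frequency pair at every feedback step, with the angle set by the velocity input, is what moves the bump around.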
@tcstewar Thanks for sharing. The code is cool, but I got no visualization results, as follows
Anything wrong?
Oops, sorry, I wasn't clear on the instructions to run it. :) In the Nengo GUI, if you want to show plots or interface elements, you right-click on the element and then select an option from the menu. For this particular example, you'll need two things:
1) right-click on `control` and select Slider. That will give you two sliders that control the input to the path integrator (one for velocity in the x-direction, and one for velocity in the y-direction)
2) right-click on `plot` and select HTML. That will pop up the custom display that I created for this particular example, which decodes the represented "bump" from the path integrator.
If you create those two and then press the play button, you should see the bump form, and then you can move the sliders and see the bump move.
You can optionally also right-click on `path_int` and choose Value to see the underlying compressed representation of the bump as well.
Great! I got the visualization results just as follows,
Thanks for the code. But I wonder if this is the path integration visualization result. In my view, I want to get two path integration trajectories: one for the ground truth, and the other for the place cell response. Am I wrong about the concept?
Sorry for the questions if they're not about nengo. I am a beginner with Nengo, and I want to build a grid cell and place cell model for simulation of navigation.
The black and white square plot is exactly the decoded representation of the cells. If you want ground truth, you'll need to add separate code that generates some desired velocity signals (rather than having you manually control that with the sliders). For example, to generate a circular path, change `control = nengo.Node([0,0])` to

```python
def control_func(t):
    return np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)

control = nengo.Node(control_func)
```

If you want to see the individual neuron activities, then you would right-click on the `path_int` population and choose "Spikes".
But I think the bigger question is what exactly do you want to do? There are many different theories as to the relationship between grid cells and place cells. Nengo will let you implement any of them, but you do need to decide what exactly you want the system to be. What are your inputs? What are your outputs? What biological aspects do you want in your model? How much realism do you want?
For a more detailed introduction to Nengo, you might want to look here: https://www.nengo.ai/getting-started/ That should cover the basics of how to define models and control and record their inputs and outputs.
@tcstewar I see. I think what I need is to build a cognitive map and navigate with the map, and I want to use a continuous attractor model. As far as I know, for a fixed environment, I can first build the cognitive map from place cell firing. Then, when entering the environment again, I can predict the location (exactly as RatSLAM does, if you know it). But I should learn the basics of nengo first. Thanks for the great contribution!
Hi, Bekolay. Thanks for sharing the code. I have got several errors here.
1. I can't run `timeview`. What's the required Python version?
2. Where is the `math` module in `nengo`? I used pip to install `nengo` automatically, but it doesn't include the `math` module. As follows: `from ca.nengo.math.impl import PiecewiseConstantFunction`