ioam / topographica

A general-purpose neural simulator focusing on topographic maps.
topographica.org
BSD 3-Clause "New" or "Revised" License

Could not find class 'topo.misc.legacy.JointScaling' to restore its parameter values? #666

Closed dancehours closed 7 years ago

dancehours commented 7 years ago

Dear Bednar,

I run simulations based on a snapshot obtained earlier. For example, to run 2000 more iterations starting from GCAL_10000.typ, I first use load_snapshot to load "GCAL_10000.typ", then I run:

topo.sim()
times = [2000*i for i in range(2)]
data = c(times=times)

From this step I get the snapshot GCAL_12000.typ, which is the snapshot of the 10000th and 12000th iterations. But the problem occurs when I load GCAL_12000.typ and want to continue running simulations from it. I get a warning (which can perhaps be regarded as an error):

"WARNING:root:Time: 000000.00 Parameterized01246: Could not find class 'topo.misc.legacy.JointScaling' to restore its parameter values (class might have been removed or renamed; if you are using this class, please file a support request via topographica.org)."

As I understand it, I get nothing by running simulations based on the second snapshot, GCAL_12000.typ, because it only contains the data from iterations 10000 to 12000 and lacks the data from earlier activity.

I would like to ask how to fix this problem. It is important for me to store snapshots in steps, in case a simulation is interrupted by accident.
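The crash-safety pattern described above, saving a snapshot every fixed number of iterations, can be sketched in plain Python. This is a generic illustration, not the Topographica API: the toy dict and `run` function stand in for the simulator state and `topo.sim()`, and `pickle.dump` stands in for saving a .typ snapshot.

```python
# Hedged sketch: periodic checkpointing so an interrupted run can be
# resumed from the most recent snapshot. Names here are illustrative.
import os
import pickle
import tempfile

def run(state, n_iter):
    """Toy update loop standing in for running the simulator n_iter steps."""
    for _ in range(n_iter):
        state["t"] += 1
        state["x"] = 0.5 * state["x"] + 1.0  # arbitrary toy dynamics
    return state

state = {"t": 0, "x": 0.0}
step, total = 2000, 10000
outdir = tempfile.mkdtemp()

for _ in range(total // step):
    state = run(state, step)
    path = os.path.join(outdir, f"toy_{state['t']}.snap")
    with open(path, "wb") as f:  # analogous to saving GCAL_<t>.typ
        pickle.dump(state, f)

print(f"saved {total // step} snapshots in {outdir}")
```

If the process dies mid-run, you reload the newest file and continue from its stored iteration count instead of restarting from zero.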

jbednar commented 7 years ago

Here it looks like you're using a class that's no longer supported. Maybe you need to add the -l option when launching Topographica, before reloading your snapshot? I haven't used that option in many years and don't remember exactly how it works.

In any case, GCAL_12000.typ will normally contain the weights and activities at that time (12000), with no data from previous times. But it sounds like that's what you want anyway.
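The point above, that a snapshot stores the complete state at one time, so no earlier history is needed to continue, can be demonstrated with a small stand-in simulation (plain Python, not Topographica): a run resumed from a mid-way pickle reproduces an uninterrupted run exactly.

```python
# Hedged sketch: a full-state snapshot is sufficient to resume.
# The dict and update rule are toy stand-ins for simulator state.
import pickle

def step(state):
    """One stand-in simulation iteration updating time and 'weights'."""
    state["t"] += 1
    state["w"] = [wi + 0.1 * state["t"] for wi in state["w"]]
    return state

def run(state, n):
    for _ in range(n):
        state = step(state)
    return state

# Uninterrupted run (120 steps, scaled down from 12000).
full = run({"t": 0, "w": [0.0, 1.0]}, 120)

# Interrupted run: snapshot at step 100, reload, continue 20 more.
mid = run({"t": 0, "w": [0.0, 1.0]}, 100)
blob = pickle.dumps(mid)              # analogous to GCAL_10000.typ
resumed = run(pickle.loads(blob), 20)

assert resumed == full                # same time, same "weights"
print("resumed run matches uninterrupted run")
```

Both paths execute the identical sequence of updates, so the resumed state is bit-for-bit equal to the uninterrupted one.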

dancehours commented 7 years ago

Now it is solved. Thanks.