sebjameswml opened 9 years ago
Alex's response of 16 Feb:
Yes, we discussed this. The problem is that it may be impossible in some simulators to support saving delay buffers, so the decision was that we wouldn't save them. This means that the state won't be precisely as it would be if you didn't stop the model, but I don't see a decent way around this.
The issues:
How do you save a delay buffer in a simulator-independent way? Can you even get the buffer info from the simulator? In BRAHMS this is either easy or hard: easy if you have different delays on different connections, because the buffer is managed by the component; difficult if you have a fixed delay, because the buffer is managed by the BRAHMS framework.
I think it is a reasonable limitation, as long as it is made clear to the user.
Alex
Half of this is now implemented in the full_delay_handling branch.
That is, projections with delays now save out their delay buffers.
To complete this ticket, I want to implement (within SpineML_2_BRAHMS) delay handling (and saving) for generic inputs. That means creating a "genericinput" component which connects the two ends of a generic input connection and does the business.
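To illustrate what such a component would have to do, here is a minimal sketch (plain Python, not SpineML_2_BRAHMS code; the class name and methods are hypothetical): a pass-through joining the two ends of a generic input connection, which owns its own delay buffer and can export it for saving.

```python
class GenericInput:
    """Hypothetical pass-through component for a generic input connection.
    It owns the delay buffer, so the buffer can be saved and restored
    along with the rest of the model state."""

    def __init__(self, n_steps_delay):
        # One slot per timestep of delay (minimum one slot).
        self.buf = [0.0] * max(1, n_steps_delay)
        self.head = 0

    def step(self, src_value):
        """Accept the source's current value; return the value from
        n_steps_delay timesteps ago."""
        out = self.buf[self.head]
        self.buf[self.head] = src_value
        self.head = (self.head + 1) % len(self.buf)
        return out

    def state(self):
        # Everything needed to continue a run exactly after save/restore.
        return {"buf": list(self.buf), "head": self.head}


gi = GenericInput(3)
outs = [gi.step(v) for v in [1.0, 2.0, 3.0, 4.0, 5.0]]
# Each input value emerges three steps later; the first three outputs
# are the buffer's initial zeros.
```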
All that work we did to save off the state variables so that you could run a model for, say, 100 ms, then continue it on for 100 ms has a little problem, which @ajc158 already mentioned in #16:
If you have delays in the connections between the populations, you need a record of the delayed state variables in order to continue the model correctly, so the saving code needs to deal with that in some way. Essentially, we have delayed differential equations whose correct operation depends on having old state variable data available; the amount of old data you need depends on the timestep size and the amount of delay in the connection.
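To make the size requirement concrete, here is a small sketch (illustrative, not from the codebase) of how many past samples a delayed connection needs saved, one per timestep covered by the delay:

```python
import math

def history_samples(delay_ms, dt_ms):
    """Number of past samples that must be saved to resume a delayed
    connection exactly: one sample per timestep covered by the delay
    (minimum one)."""
    return max(1, math.ceil(delay_ms / dt_ms))

# e.g. a 2 ms connection delay at dt = 0.1 ms needs 20 stored samples,
# while the same delay at dt = 0.5 ms needs only 4.
```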
The components hold a delay buffer whenever there is a connection delay; we need to store this buffer in the model XML, or in some sort of associated file, and then restore it into the component at run time.
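As a sketch of the "associated file" option, a round trip might look like the following (the JSON sidecar format, file name, and function names are assumptions for illustration; the actual format is still an open question in this ticket):

```python
import json
import os
import tempfile

def save_delay_buffer(path, samples, head):
    """Write the buffer contents plus the read position, so a continued
    run sees exactly the delayed values it would have seen had the
    model never been stopped."""
    with open(path, "w") as f:
        json.dump({"head": head, "samples": samples}, f)

def load_delay_buffer(path):
    """Read the buffer back, ready to be restored into the component."""
    with open(path) as f:
        d = json.load(f)
    return d["samples"], d["head"]

# Round-trip check against a throwaway file.
path = os.path.join(tempfile.mkdtemp(), "proj0.delaybuf.json")
save_delay_buffer(path, [0.1, 0.2, 0.3], 1)
samples, head = load_delay_buffer(path)
```

Saving the head (read position) alongside the samples matters: the raw values alone are not enough to resume the ring buffer at the right point.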