projectmesa / mesa

Mesa is an open-source Python library for agent-based modeling, ideal for simulating complex systems and exploring emergent behaviors.
https://mesa.readthedocs.io

Export the simulation as a video #885

Open nivedkrishnans opened 3 years ago

nivedkrishnans commented 3 years ago

What's the problem this feature will solve? A readily obtainable video output of a simulation would make collaboration between people working on a model easier, as well as presentations, publications, etc. A normal screen recording will not work well, since the time required for each step may fluctuate during a simulation. NetLogo offers an option to record the simulation region or even the whole interface (with the sliders, graphs, etc.).

Describe the solution you'd like Just a thought: it would be best if we could obtain exactly what the browser visualization shows, as a video. Maybe we could screen-capture the active elements of the screen at every step and store the captures temporarily, then at the end of the simulation stitch them together (say, 25 steps per second) and write the result to a predefined location/filename.

EDIT: I came across a few Python packages that can 'program' SVGs. We could assign a shape to every agent and place the shapes at the appropriate coordinates to get a vector image for each step. Maybe also display the parameters of the model, etc.
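For illustration, a minimal sketch of that SVG idea using the svgwrite package. The frame size, colour, and the `model.grid` / `model.schedule` / `agent.pos` attributes are assumptions about a typical Mesa grid model, not something prescribed by the issue:

```python
import svgwrite


def save_step_as_svg(model, step, size=400):
    """Write one SVG frame for the current model state (illustrative sketch)."""
    dwg = svgwrite.Drawing(f"step_{step:04d}.svg", size=(f"{size}px", f"{size}px"))
    scale = size / model.grid.width
    for agent in model.schedule.agents:
        x, y = agent.pos  # assumes agents are placed on a grid and have a .pos
        dwg.add(dwg.circle(center=(x * scale, y * scale), r=scale / 2, fill="steelblue"))
    dwg.save()
```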

rht commented 9 months ago

A temporary solution is to record the browser screen using, e.g., https://chrome.google.com/webstore/detail/screen-recorder/hniebljpgcogalllopnjokppmgbhaden.

But given that we use Solara now, it should be easier to capture the Matplotlib space state as pictures that can be turned into an animated GIF.

vagechirkov commented 9 months ago

> A temporary solution is to record the browser screen using, e.g., https://chrome.google.com/webstore/detail/screen-recorder/hniebljpgcogalllopnjokppmgbhaden.
>
> But given that we use Solara now, it should be easier to capture the Matplotlib space state as pictures that can be turned into an animated GIF.

The second option (to capture the matplotlib space state as images) sounds good to me. Any hints on where to start with this? I'm happy to work on this feature if it might be useful to others.

rht commented 9 months ago

A quick hack to test the idea is to add a line after https://github.com/projectmesa/mesa/blob/9c9a02e25df7c83c866ae9a23416c5d32e76a48d/mesa/experimental/jupyter_viz.py#L262 that saves the plot to a file whose filename is an increasing number, and then combine the pictures into one file via the ImageMagick CLI.

rht commented 9 months ago

https://askubuntu.com/questions/648244/how-do-i-create-an-animated-gif-from-still-images-preferably-with-the-command-l
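The same stitching step could also be done from Python with Pillow instead of the ImageMagick CLI; a minimal sketch, assuming the per-step frames were saved as space_0.png, space_1.png, and so on:

```python
from pathlib import Path

from PIL import Image

# Collect the per-step frames in numeric order (space_0.png, space_1.png, ...).
frame_paths = sorted(
    Path(".").glob("space_*.png"), key=lambda p: int(p.stem.split("_")[1])
)
frames = [Image.open(p) for p in frame_paths]

# Write all frames into a single animated GIF.
frames[0].save(
    "simulation.gif",
    save_all=True,
    append_images=frames[1:],
    duration=40,  # ms per frame, i.e. 25 frames per second as suggested above
    loop=0,
)
```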

Corvince commented 9 months ago

You can also have a look at https://matplotlib.org/stable/api/animation_api.html

The examples at the bottom are especially helpful for creating an animation. There is no need to use Solara here; you can just do your plots normally and call model.step() inside the animation function.
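A bare-bones version of that suggestion might look like the following. This is only a sketch: the `model.grid`, `model.schedule`, and `agent.pos` attributes are assumptions about a typical Mesa grid model rather than anything specified in this thread.

```python
import matplotlib.animation as animation
import matplotlib.pyplot as plt


def animate_model(model, n_steps=100, filename="model.gif"):
    fig, ax = plt.subplots(figsize=(6, 6))
    ax.set_xlim(0, model.grid.width)
    ax.set_ylim(0, model.grid.height)
    ax.set_axis_off()
    scatter = ax.scatter([], [])

    def frame(_):
        # Advance the model one step, then update the existing scatter in place.
        model.step()
        xs, ys = zip(*(agent.pos for agent in model.schedule.agents))
        scatter.set_offsets(list(zip(xs, ys)))
        return (scatter,)

    ani = animation.FuncAnimation(fig, frame, frames=n_steps, interval=200)
    ani.save(filename, writer=animation.PillowWriter(fps=10))
```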

rht commented 9 months ago

That animation API uses set_data, which Solara doesn't use (it is more performant than Solara's current update method, which recreates the plot object from scratch every time). It means there would need to be a separate plot object that is updated with this method after each model step. Not ideal, but if it works, it's fine I suppose.

vagechirkov commented 9 months ago

> A quick hack to test the idea is to add a line after https://github.com/projectmesa/mesa/blob/9c9a02e25df7c83c866ae9a23416c5d32e76a48d/mesa/experimental/jupyter_viz.py#L262 that saves the plot to a file whose filename is an increasing number, and then combine the pictures into one file via the ImageMagick CLI.

This works well 🎉 I think it might be the easiest integration with the existing infrastructure.

```python
space_ax.set_axis_off()
space_fig.savefig(f"space_{model.schedule.steps}.png")
```

However, I would personally prefer to have a separate function that generates a GIF for a given model. I guess Matplotlib's animation API is the best option then.

vagechirkov commented 9 months ago

Here is the implementation using matplotlib.animation:

```python
import matplotlib.animation as animation
from matplotlib.figure import Figure

from mesa.experimental.jupyter_viz import JupyterContainer


def plot_n_steps(viz_container: JupyterContainer, n_steps: int = 10):
    model = viz_container.model_class(
        **viz_container.model_params_input, **viz_container.model_params_fixed
    )

    space_fig = Figure(figsize=(10, 10))
    space_ax = space_fig.subplots()
    space_ax.set_axis_off()

    # set limits to grid size
    space_ax.set_xlim(0, model.grid.width)
    space_ax.set_ylim(0, model.grid.height)

    # set equal aspect ratio
    space_ax.set_aspect('equal', adjustable='box')

    scatter = space_ax.scatter(**viz_container.portray(model.grid))

    def update_grid(_scatter, data):
        _scatter.set_offsets(list(zip(data["x"], data["y"])))
        if "c" in data:
            _scatter.set_color(data["c"])
        if "s" in data:
            _scatter.set_sizes(data["s"])
        return _scatter

    def animate(_):
        if model.running:
            model.step()
        return update_grid(scatter, viz_container.portray(model.grid))

    ani = animation.FuncAnimation(
        space_fig, animate, repeat=True, frames=n_steps, interval=400
    )

    # To save the animation using Pillow as a gif
    writer = animation.PillowWriter(fps=15, metadata=dict(artist='Me'), bitrate=1800)
    ani.save('scatter.gif', writer=writer)
```

It's actually pretty fast: ~10 seconds for 1000 steps and 5 agents. Please let me know what you think @rht @Corvince!

rht commented 9 months ago

We could incorporate your code, but there is a possibility that we are migrating the plotting to Altair instead (see #1806). This is still up for discussion.

vagechirkov commented 7 months ago

> We could incorporate your code, but there is a possibility that we are migrating the plotting to Altair instead (see #1806). This is still up for discussion.

Just let me know if this or something like it might be useful. I am actively using it to produce GIFs to check the model when running on the compute cluster.