
Planktos Agent-based Modeling Framework

Welcome to Planktos, a framework for constructing agent-based models of plankton, tiny insects, and other small organisms whose effect on the surrounding fluid can be considered negligible. This is an active research project and work is ongoing.

Check out the online documentation at https://planktos.readthedocs.io.

If you use this software in your project, please cite the following paper:
Strickland, W.C., Battista, N.A., Hamlet, C.L., Miller, L.A. (2022). Planktos: An agent-based modeling framework for small organism movement and dispersal in a fluid environment with immersed structures. Bulletin of Mathematical Biology, 84(72).

Additionally, the documentation can be cited as:
Strickland, W.C. (2017). Planktos agent-based modeling framework, software documentation. https://planktos.readthedocs.io.

A suggested BibTeX entry for both of these is included in the file Planktos.bib.

This project is supported by the National Science Foundation through award number DMS-2410988, 2024-2027. The opinions, findings, and conclusions or recommendations expressed here are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Installation & Dependencies

Installing FFmpeg

Before using Planktos, FFmpeg must be installed and accessible via the $PATH environment variable in order to save video files of simulation results.

There are a variety of ways to install FFmpeg, such as the official download links or your package manager of choice (e.g. sudo apt install ffmpeg on Debian/Ubuntu, brew install ffmpeg on macOS, etc.).

Regardless of how FFmpeg is installed, you can check if your environment path is set correctly by running the ffmpeg command from the terminal, in which case the version information should appear, as in the following example (truncated for brevity):

$ ffmpeg
ffmpeg version 4.3.1 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 10.2.1 (GCC) 20200726

Note: The actual version information displayed here may vary from one system to another. If a message such as ffmpeg: command not found appears instead of the version information, FFmpeg is not properly installed.

Installing Planktos

Once FFmpeg is installed, Planktos can be installed from source using pip (Python >= 3.7). Navigate to the Planktos directory in a terminal and run:

pip install .

Non-optional dependencies (other than FFmpeg) will be installed automatically.

Planktos is still in active development and updates occur often. You should therefore pull the source repo often and then reinstall using the same command. To avoid needing to reinstall each time you pull the repo, you can instead install Planktos in "editable" mode (requires pip version >= 21.1):

pip install -e .

Planktos can then be imported like any other Python package from any directory. Either installation method can later be removed with:

pip uninstall planktos

Running Directly From Source (no install)

I'm assuming you're using Anaconda, and if so, I strongly suggest that you ditch the default package manager conda (which is essentially broken at this point, particularly if you need packages from conda-forge, and we do) in favor of mamba. The commands are the same (it's a drop-in replacement for conda), but it uses a C++ solver based on libsolv, the same dependency-resolution library used by RedHat, Debian, and others. It also has multi-threaded downloads and doesn't break when trying to obtain vtk and/or pyvista. Install it with the following command:

$ conda install -c conda-forge mamba

Having done that, the dependencies are as follows:

If you want to use the supplied script to convert data from IBAMR into vtk, you will also need a Python 2.7 environment with numpy and VisIt installed (VisIt's Python API is written in Python 2.7).

Tests

All tests can be run by typing pytest into a terminal in the base directory. This requires installation of the optional pytest package.

Overview

Currently, Planktos has built-in capabilities to load either time-independent or time-dependent 2D or 3D fluid velocity data specified on a regular mesh. ASCII vtk format is supported, as well as ASCII vtu files from COMSOL (single-time data only) and NetCDF. More regular grid formats, especially open-source ones, may be supported in the future; please contact the author (cstric12@utk.edu) if you have a format you would like to see supported. A few analytical, 1D flow fields are also available and can be generated in either 2D or 3D environments; these include Brinkman flow, two-layer channel flow, and canopy flow. Flow fields can also be extended and tiled in simple ways as appropriate. Mesh data must be time-invariant and loaded via IB2d/IBAMR-style vertex data (2D) or via stl file in 3D. Again, more (open source) formats may be considered if requested.
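As a rough sketch of this workflow: the names planktos.environment and set_brinkman_flow and their arguments below are assumptions based on the description above and the examples folder, so treat the online documentation as authoritative.

import planktos

# Create a 2D environment; the domain size arguments are illustrative.
envir = planktos.environment(Lx=10, Ly=3)

# Generate one of the built-in analytical flow fields (here, Brinkman flow).
# The parameter names and values are assumptions; check the documentation.
envir.set_brinkman_flow(alpha=66, h_p=1.5, U=1, dpdx=1, res=101)

# Alternatively, fluid velocity data can be read in from ASCII vtk, COMSOL vtu,
# or NetCDF files using the environment's data-loading methods.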

For agents, there is support for multiple species (swarms) along with individual variation through a pandas DataFrame property of the swarm class (swarm.props). Individual agents have access to the local flow field through interpolation of the spatio-temporal fluid velocity grid; specifically, Planktos implements a cubic spline in time with linear interpolation in space (a tricubic spline in space is planned). In addition to fully customizable behavior, Planktos includes an Ito SDE solver (Euler-Maruyama method) for movement specified as an SDE of the type dX_t = \mu dt + \sigma dW_t, as well as an inertial particle behavior for dynamics described by the linearized Maxey-Riley equation (Haller and Sapsis, 2008). These two may be combined, and other, user-supplied ODEs can also be fed into the drift term of the Ito SDE. Finally, agents treat immersed boundary meshes as solid barriers: upon encountering an immersed mesh boundary, any remaining movement is projected onto the mesh. Both concave and convex mesh joints are supported, and pains have been taken to make the projection algorithm as numerically stable as possible.
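For concreteness, the Euler-Maruyama scheme mentioned above advances each agent by X_{t+dt} = X_t + mu*dt + sigma*sqrt(dt)*Z, with Z a standard normal draw. The snippet below is a generic illustration of that update written in plain NumPy; it is not Planktos's internal implementation.

import numpy as np

def euler_maruyama_step(X, mu, sigma, dt, rng):
    # One Euler-Maruyama step for dX_t = mu dt + sigma dW_t.
    # X: (N, d) agent positions, mu: (d,) drift (e.g. local fluid velocity plus
    # active movement), sigma: (d, d) diffusion matrix, dt: time step.
    dW = rng.standard_normal(X.shape) * np.sqrt(dt)  # Brownian increments
    return X + mu * dt + dW @ sigma.T

rng = np.random.default_rng(1)
X = rng.uniform(size=(100, 2))                               # 100 agents in 2D
X = euler_maruyama_step(X, np.array([0.1, 0.0]), 0.05 * np.eye(2), 0.1, rng)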

Single-time and animation plotting of results is available in 2D and 3D; support for plotting multiple agent species together has not yet been implemented.

Quickstart

There are several working examples in the examples folder, including a 2D simulation, a 2D simulation demonstrating individual variation, a 3D simulation, a simulation utilizing vtk data obtained from IBAMR (located in the tests/IBAMR_test_data folder), and a simulation demonstrating subclassing of the get_positions method for user-defined agent behavior. There are also two examples demonstrating how to import vertex data (from IB2d and IBAMR), automatically create immersed boundaries out of this data, and then simulate agent movement with these meshes as solid boundaries which the agents respect. More examples will be added as functionality is added. To run any of these examples, change your working directory to the examples directory and then run the desired script.
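In that spirit, a minimal simulation script might look roughly like the sketch below. The calls add_swarm, move, plot, and plot_all and their arguments are assumptions drawn from the description in this README; treat the scripts in the examples folder as authoritative.

import planktos

# Environment and flow field as in the Overview sketch above.
envir = planktos.environment(Lx=10, Ly=3)
envir.set_brinkman_flow(alpha=66, h_p=1.5, U=1, dpdx=1, res=101)

# Add a swarm of 100 agents and step the simulation forward in time.
swrm = envir.add_swarm(100)
for _ in range(50):
    swrm.move(0.1)    # advance all agents by dt = 0.1

# Plot the final positions; an animation of the full history can be written
# to file instead (this is where FFmpeg is needed), e.g.:
swrm.plot()
# swrm.plot_all('quickstart.mp4', fps=10)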

An important note about immersed boundary meshes: it is assumed that segments of the boundary do not cross except at vertices. This keeps computational speed up and numerical complexity down. So, especially if you are auto-creating boundaries from vertex data, be sure to check that boundary segments do not intersect each other away from specified vertices! A quick way to do this is to call environment.plot_envir() after the mesh import is done to visually check that the boundary formed correctly and doesn't cross itself in unexpected ways. There is also a method of the environment class called add_vertices_to_2D_ibmesh which will add vertices at all 2D mesh crossing points; however, its use is discouraged because it results in complex vertices that attach more than two mesh segments, as well as leftover segments that do not contribute to the dynamics at all. Do not expect meshes resulting from this method to have undergone rigorous testing, and running the method will add significant computational overhead due to the need to search for crossings.
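For instance, the visual check might look like the following sketch. The method name read_IB2d_vertex_data and the file path are assumptions (the vertex-data examples show the exact call); plot_envir is the method described above.

import planktos

envir = planktos.environment(Lx=10, Ly=3)

# Import IB2d-style vertex data and auto-create an immersed boundary mesh.
# Method name and path are assumptions/hypothetical; see the vertex-data examples.
envir.read_IB2d_vertex_data('path/to/structure.vertex')

# Visually verify that the resulting boundary does not cross itself away from
# specified vertices before running any simulations.
envir.plot_envir()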

When experimenting with different agent behavior than what is prescribed in the swarm class by default (e.g. different movement rules), it is strongly suggested that you subclass swarm (found in framework.py) in an appropriate subfolder. That way, you can keep track of everything you have tried and its outcome.
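A skeleton of such a subclass might look like the sketch below. The get_positions signature, the self.positions attribute, and the get_fluid_drift method are assumptions based on this README's description, and the swarm class is assumed here to be importable as planktos.swarm; the behavior-subclassing example in the examples folder shows the real pattern.

import numpy as np
import planktos

rng = np.random.default_rng(1)

class my_swarm(planktos.swarm):
    # Swarm with custom movement rules (illustrative sketch only).

    def get_positions(self, dt, params=None):
        # Hypothetical rule: passive transport with the local fluid flow plus a
        # small unbiased random walk. get_fluid_drift() is assumed to return the
        # interpolated fluid velocity at each agent's current position.
        drift = self.get_fluid_drift()
        jitter = rng.normal(scale=0.01, size=self.positions.shape)
        return self.positions + drift * dt + jitter * np.sqrt(dt)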

Research that utilizes this framework can be seen in:

API

Class: environment

Class: swarm