tlambert03 opened 4 weeks ago
Yes. We'd also like some 3D + Time, annotated ground truth that could be used for all sorts of AI training (segmentation and tracking).
I'm happy to have your help making it more distributable. So far, I've just been playing with it and was able to generate a decent pile of data using this workflow:
1. `input_mesh_generation`: how the simulation should start
2. `launch_simulation`: create the .vtk surface files for a bunch of timepoints
3. `voxelize`: create 3D voxel (.tiff) files with labels

So far this has proved to me that it can work.
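As a toy illustration of what the voxelize step produces (a labeled 3D integer volume per timepoint), here is a minimal pure-Python sketch. The analytic sphere stands in for a .vtk surface mesh, and none of the names here come from the actual pipeline or the cytopacq API:

```python
# Hypothetical sketch of the "voxelize" idea: rasterize closed surfaces
# into a dense 3D grid where each object gets its own integer label.
# A real pipeline would read .vtk meshes and write labeled .tiff stacks.

def voxelize_sphere(shape, center, radius, label):
    """Return a dense 3D grid (nested lists) with `label` inside the sphere."""
    zc, yc, xc = center
    grid = [[[0] * shape[2] for _ in range(shape[1])] for _ in range(shape[0])]
    for z in range(shape[0]):
        for y in range(shape[1]):
            for x in range(shape[2]):
                if (z - zc) ** 2 + (y - yc) ** 2 + (x - xc) ** 2 <= radius ** 2:
                    grid[z][y][x] = label
    return grid

# One "timepoint": two non-overlapping objects with distinct labels
# in a 16x16x16 volume.
vol = voxelize_sphere((16, 16, 16), (8, 8, 4), 3, label=1)
second = voxelize_sphere((16, 16, 16), (8, 8, 12), 3, label=2)
for z in range(16):
    for y in range(16):
        for x in range(16):
            if second[z][y][x]:
                vol[z][y][x] = second[z][y][x]
```

Per-voxel labels like these (rather than binary masks) are what make the output usable as segmentation and tracking ground truth.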
This looks quite promising as well.
TL;DR: run the command below to generate a 5-time-point (`-t 5`) version of the data and write it to `C:\tmp`. FIJI can open the image and label files.

```shell
docker run -it -v "C:\tmp:/data" ghcr.io/tp81/cytopacq:master -c /usr/local/config/granulocyte-debug.ini -f /data/im.ics -l /data/lb.ics -e /data/error.log -t 5
```
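For readability, here is the same one-liner split across lines. The flag meanings in the comments are inferred from the filenames and the description above, so treat them as a best guess rather than official documentation:

```shell
# Same command, broken out. Assumes Docker is installed; C:\tmp is a
# Windows host path -- on Linux/macOS use e.g. -v "$HOME/tmp:/data".
docker run -it \
  -v "C:\tmp:/data" \
  ghcr.io/tp81/cytopacq:master \
  -c /usr/local/config/granulocyte-debug.ini \
  -f /data/im.ics \
  -l /data/lb.ics \
  -e /data/error.log \
  -t 5
# -v       mount the host folder as /data inside the container
# -c       simulation config (granulocyte debug preset, inferred)
# -f / -l  output image and label volumes (.ics)
# -e       error log path (inferred)
# -t       number of time points to simulate
```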
Dockerfile and instructions from the first author here: https://github.com/tp81/cytopacq
hey all, hey @dmilkie! :wave:
I'm just getting acquainted here (saw this mentioned in a talk by @kevinyamauchi). I'd love to try to use/depend on this as a ground-truth source for microsim ... I haven't looked too deeply at dependencies and platform compatibilities yet, but I'm curious to gauge your interest in/openness to making it more distributable.