BiAPoL / blog

Most of us study and work at the Bio-image Analysis Technology Development group at the DFG Cluster of Excellence “Physics of Life” at the TU Dresden. We blog about image data science, knowledge exchange and research data management in the life sciences.
https://biapol.github.io/blog/
Creative Commons Attribution 4.0 International

added GPU-accelerated image processing on the TUD HPC cluster #50

Closed thawn closed 1 year ago

thawn commented 1 year ago

First draft of GPU-accelerated image processing on the TUD HPC cluster.

Now as a new branch on BiAPoL/blog

Feedback very welcome :)

thawn commented 1 year ago

@haesleinhuepf @jo-mueller thanks a lot for your excellent feedback :)

I have two remaining questions:

  1. should we publish the blog article now or wait for the part about moving data?
  2. After the discussion here (and also the discussion with Melissa, Jessica and Fabian), it seems sensible to make the singularity-devbio-napari repository public. Is that o.k./possible with the math GitLab? It will be difficult to migrate the CI workflow to GitHub, and neither github.com nor gitlab.com will let us store tens of gigabytes of Singularity images (at least not without paying for the storage). The most feasible alternative might be the TU Chemnitz GitLab, but I would first ask them whether they are o.k. with us storing tens of gigabytes of image files in their container registry.
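For context, once the image is publicly available in a container registry, using it on the cluster could look roughly like the sketch below. The registry URL, image tag, resource requests and script name are all assumptions for illustration, not the project's actual values:

```shell
#!/bin/bash
#SBATCH --job-name=gpu-image-processing
#SBATCH --gres=gpu:1          # request one GPU
#SBATCH --time=01:00:00
#SBATCH --mem=32G

# Hypothetical registry URL -- replace with wherever the image ends up.
singularity pull devbio-napari.sif \
    oras://registry.example.org/biapol/devbio-napari:latest

# --nv makes the host's NVIDIA driver and libraries visible inside
# the container, so GPU-accelerated code can run.
singularity exec --nv devbio-napari.sif python my_processing_script.py
```

Submitting this with `sbatch` would pull the image once and run the (hypothetical) processing script inside it on a GPU node.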
haesleinhuepf commented 1 year ago
> 1. should we publish the blog article now or wait for the part about moving data?

I would wait until opening images from the fileserver works. Otherwise the blog post is a toothless tiger 😉

> 2. make the singularity-devbio-napari repository public

Yes, do it! (If not yet done)

Love to see progress here! Cool projects ahead 🤓

thawn commented 1 year ago

@haesleinhuepf I just added the section: Transfer Data to/from the HPC cluster :-)

Since I am neither PI nor contact person on our HPC project, I cannot see the links and user interface needed to apply for storage on the HPC cluster. Could you please elaborate on Step 1?
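For readers following along: transferring data between a workstation and the cluster typically comes down to rsync over SSH via a transfer/export node. A minimal sketch, where the hostname and the subdirectories under the project space are placeholders rather than the actual ZIH endpoints:

```shell
# Copy a local dataset to the HPC project space
# (hostname and subdirectories are placeholders).
rsync -avz --progress ./my_dataset/ \
    your_login@transfer.example.hpc.tu-dresden.de:/projects/p_bioimage/my_dataset/

# And back again, e.g. to fetch processed results.
rsync -avz --progress \
    your_login@transfer.example.hpc.tu-dresden.de:/projects/p_bioimage/results/ \
    ./results/
```

The `-a` flag preserves permissions and timestamps, `-z` compresses during transfer, and rsync only sends files that changed, which matters when re-syncing large image datasets.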

jo-mueller commented 1 year ago

> @haesleinhuepf I just added the section: Transfer Data to/from the HPC cluster :-)
>
> Since I am neither PI nor contact person on our HPC project, I cannot see the links and user interface needed to apply for storage on the HPC cluster. Could you please elaborate on Step 1?

Hi @thawn ,

maybe what you are looking for is this link?

haesleinhuepf commented 1 year ago

You can apply for a fileserver / project space in the ZIH self-service portal. Then, you need to ask HPC support via email to mount it on the transfer nodes ...
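Once HPC support confirms the mount, a quick sanity check from a transfer node could look like this (hostname and mount point are placeholders, not the actual ZIH paths):

```shell
# Log into a transfer node (placeholder hostname) ...
ssh your_login@transfer.example.hpc.tu-dresden.de

# ... then check that the fileserver share is mounted and readable.
df -h /fileserver/your_group_share
ls -l /fileserver/your_group_share
```

If `df` does not list the share, the mount request has not gone through yet and a follow-up with HPC support is needed.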

thawn commented 1 year ago

@haesleinhuepf I got the part about how to get the fileserver space.

What I could not figure out was how to apply for the project space on the cluster (i.e., how did we get /projects/p_bioimage/?). Do you also apply for that on the self-service portal, or does it come automatically with the HPC project?

haesleinhuepf commented 1 year ago

No, afaik, you need to send an email to hpc-support.

jo-mueller commented 1 year ago

> What I could not figure out was how to apply for the project space on the cluster (i.e. how did we get /projects/p_bioimage/?)

I think this may be what you are looking for

thawn commented 1 year ago

@haesleinhuepf here is a small status update:

Here is a short summary from my test session with Conni:

For the next steps I suggest the following:

thawn commented 1 year ago

@haesleinhuepf The blog posts are ready for another round of review ;-)