fiji-hpc / hpc-workflow-manager-full


add button to request "wipe all my data" from the cluster #7

Open xulman opened 3 years ago

xulman commented 3 years ago

From NEUBIAS, people asked how long their data stays with us:

For users' comfort and peace of mind, we should add a button that clears up the user's space on the cluster.

velissarious commented 3 years ago

Initially, the data along with the script were deleted when the HPC-Workflow-Manager job (to which the data are tied) was deleted by the user. This feature has been disabled because uploading huge datasets is time-consuming and users may want to re-use datasets in other jobs. Deleting a job currently removes the local data but keeps the data on the remote cluster.

Currently the user can delete their data manually by connecting with SSH.
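For illustration, manual deletion over SSH could look like the fragment below. The host name and the per-user data directory are assumptions for the sake of the example, not the cluster's actual layout:

```shell
# Hypothetical: replace host and path with the cluster's real login node
# and the directory HPC-Workflow-Manager uses for uploaded job data.
ssh user@cluster.example.org 'rm -rf -- "$HOME/hpc-workflow-manager/jobs"'
```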

I have been thinking of adding a different way to upload and download data, separate from the scripts. This could be a new window titled "Data Manager" with a list of datasets (remote directories) that the user could upload or delete one by one. These datasets would then be available for use/re-use by scripts uploaded separately by the users.
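A minimal sketch of the model behind such a "Data Manager" window, decoupled from any GUI. All names here (`RemoteDataStore`, `DataManager`, `InMemoryStore`) are illustrative assumptions, not part of HPC-Workflow-Manager's actual code base; a real store would talk to the cluster over SSH/SFTP instead of holding data in memory:

```java
import java.util.*;

// Abstraction over the cluster-side storage (hypothetical interface).
interface RemoteDataStore {
    List<String> listDatasets();      // remote dataset directories
    void deleteDataset(String name);  // wipe one dataset
}

// In-memory stand-in for the cluster, for demonstration only.
class InMemoryStore implements RemoteDataStore {
    private final Set<String> datasets = new TreeSet<>();
    InMemoryStore(String... initial) { datasets.addAll(Arrays.asList(initial)); }
    public List<String> listDatasets() { return new ArrayList<>(datasets); }
    public void deleteDataset(String name) {
        if (!datasets.remove(name))
            throw new NoSuchElementException("no such dataset: " + name);
    }
}

// Model behind the proposed window: list datasets, delete one by one,
// or wipe everything at once (the "wipe all my data" button).
class DataManager {
    private final RemoteDataStore store;
    DataManager(RemoteDataStore store) { this.store = store; }
    List<String> datasets() { return store.listDatasets(); }
    void delete(String name) { store.deleteDataset(name); }
    void wipeAll() {
        for (String d : store.listDatasets()) store.deleteDataset(d);
    }
}

public class DataManagerSketch {
    public static void main(String[] args) {
        DataManager dm = new DataManager(new InMemoryStore("neubias-demo", "test-run"));
        System.out.println(dm.datasets());  // prints [neubias-demo, test-run]
        dm.delete("test-run");              // delete a single dataset
        dm.wipeAll();                       // one-click full cleanup
        System.out.println(dm.datasets());  // prints []
    }
}
```

Keeping the store behind an interface would let the deletion logic be tested without a live cluster connection.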

Alternatively, a dialog could be added asking the user whether they wish to delete the data associated with a job when deleting it. But this solution would not be ideal, as they would have to delete the data of such jobs manually in case they chose not to the first time.