Closed davramov closed 1 month ago
As discussed with Dylan, I removed the `ALCF_compute_test` directory and copied my useful test scripts to the `/examples` folder. Additionally, instead of merging the ALCF transfer/compute flow into `bl832/move.py`, I have added `bl832/ALCF_compute_reconstruction.py` as a separate module.
Note: the data transfer in the following function still needs its inputs defined (the file source/destination and the `.txt` file required for reconstruction), which we will determine in Monday's (6/17) meeting with Dula.
```python
from prefect import flow

@flow(name="alcf_tomopy_reconstruction_flow")
def alcf_tomopy_reconstruction_flow():
    ...  # transfer inputs still to be defined (see note above)
```
I have updated the `ALCF_tomopy_reconstruction.py` script and its corresponding files (`reconstruction.py`, `.env`) to account for a file name and folder name when calling the reconstruction Globus Flow. I have also included a dedicated `README_ALCF_tomopy_reconstruction.md` with an outline of the entire flow, details about folder and file names, and step-by-step instructions for configuring the Globus environments, the NERSC and ALCF endpoints (transfer and compute), setting up and logging into the confidential client, and running the script.
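For orientation, the confidential-client setup described in the README can be sketched roughly as follows. This is a sketch only: the variable names `GLOBUS_CLIENT_ID` and `GLOBUS_CLIENT_SECRET` are assumptions (check the repo's actual `.env` for the real keys), and the `globus_sdk` call is shown only as a comment.

```python
import os
from pathlib import Path

def load_dotenv(path=".env"):
    """Parse simple KEY=VALUE lines from a .env file into os.environ."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip().strip('"'))

# Hypothetical key names -- see the repo's .env and README for the real ones:
# load_dotenv()
# client_id = os.environ["GLOBUS_CLIENT_ID"]
# client_secret = os.environ["GLOBUS_CLIENT_SECRET"]
# auth = globus_sdk.ConfidentialAppAuthClient(client_id, client_secret)
```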
July 26 Update
Major changes:

- Incorporated `tiff_to_zarr.py` into the Globus Flow on ALCF Polaris
- File pruning with Prefect worker scheduling (updated `orchestration/flows/bl832/prune.py`)
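The pruning logic itself isn't shown in this PR description. As a rough illustration only (assuming age-based deletion, which is an assumption, and with the Prefect `@flow` decorator omitted so the function stands alone):

```python
import time
from pathlib import Path

def prune_old_files(directory, max_age_days: float):
    """Delete files under `directory` older than `max_age_days`.

    Sketch only -- the real prune.py may use different criteria
    (e.g. confirming the data landed at its destination first).
    Returns the list of deleted paths.
    """
    cutoff = time.time() - max_age_days * 24 * 3600
    deleted = []
    for path in Path(directory).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            deleted.append(path)
    return deleted
```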
Other changes:

- `config.yml` and `orchestration/flows/bl832/config.py`: cleaned up some endpoint names for consistency
- `docs/README_alcf832.md`: updated to reflect recent progress
New files:

- `create_deployments_832_alcf.sh`: shell script that builds and deploys the pruning code for this workflow.
- `examples/Tomopy_for_ALS.ipynb`: this notebook also lives in another repository, but this version matches the changes made to the registered flow function and confidential client for this specific workflow.
- `examples/tiff_to_zarr.py`: reflects the code on Polaris at `/eagle/IRIbeta/als/examples/tiff_to_zarr.py` used in the Globus Flow.
Moved and renamed files:

- `examples/test_cc_auth.py`
- `scripts/globus_tomopy_flow_init.py`
  - TODO: update this script so we can move away from using a Jupyter notebook to initialize the steps.
- Globus helper code:
  - `orchestration/globus/flows.py` (was `globus_flows_utils.py`)
  - `orchestration/globus/transfer.py` (was `orchestration/globus.py`)
- `orchestration/flows/bl832/alcf.py` (was `ALCF_tomopy_reconstruction.py`)
- `docs/README_alcf832.md` (was `orchestration/flows/bl832/README_ALCF_tomopy_reconstruction.md`)
Files with modified imports due to the moved and renamed Globus helper code, but otherwise unchanged:

- `orchestration/flows/bl7012/config.py`
- `orchestration/flows/bl7012/move.py`
- `orchestration/flows/bl7012/move_recon.py`
- `orchestration/flows/bl832/move.py`
- `orchestration/prefect.py`
Test environment

I isolated the file tree for bl832 and its dependencies, specifically this Globus reconstruction flow.
File tree
Updates to `orchestration/flows/bl832/move.py`

The main changes to the production code are the following functions:

- [new] `transfer_data_to_alcf()`: follows the structure of `transfer_data_to_nersc()`
- [new] `alcf_tomopy_reconstruction_flow()`: note that this function still needs to be updated to take `function_inputs = {"rundir": "/eagle/IRIBeta/als/sea_shell_test"}` as input (used as input for `reconstruction.py`)
- [modified] `process_new_832_file_flow(..., send_to_alcf=False)`
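The effect of the new keyword can be illustrated with a stripped-down sketch. The Prefect decorators and the real transfer/reconstruction calls are replaced by stubs; only the `send_to_alcf=False` default is taken from this PR, everything else is hypothetical.

```python
def transfer_data_to_alcf(file_path: str) -> bool:
    """Stub standing in for the real Globus transfer task."""
    print(f"transferring {file_path} to ALCF")
    return True

def alcf_tomopy_reconstruction_flow() -> None:
    """Stub standing in for the registered reconstruction Globus Flow."""
    print("running tomopy reconstruction on Polaris")

def process_new_832_file_flow(file_path: str, send_to_alcf: bool = False) -> bool:
    """Existing processing continues unchanged; the ALCF branch is opt-in."""
    # ... existing NERSC/data-portal steps would run here ...
    if send_to_alcf:
        if transfer_data_to_alcf(file_path):
            alcf_tomopy_reconstruction_flow()
        return True
    return False
```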
Snippet: