💡 Summary

Replace the SSH-based (Paramiko) distribution of work between the cyhy-commander and the scanner instances with a shared Amazon EFS filesystem.

Background

Currently the cyhy-commander uses the Paramiko library to distribute work to, and retrieve results from, the scanner instances over SSH. This design dates from the system's on-premises deployment before the migration to AWS.
Motivation and context
Although robust, this design limits our ability to explore alternative, cloud-native products and solutions that could improve the system. Instead of using SSH to communicate with the scanner instances, we can use Amazon EFS (Elastic File System) and mount a shared filesystem on the instance running the cyhy-commander as well as on each scanner instance.
Implementation notes
The cyhy-commander will be updated to interact with the local filesystem (where the EFS share is mounted) instead of using SSH. The cyhy-runner on each scanner instance will continue to pick up jobs and write results at the same filesystem path, but that path will now be backed by the EFS share rather than by Amazon EBS (Elastic Block Store). When mounting the EFS share on the scanner instances we should look at using an EFS access point for each instance, both to enforce the uid/gid of files written and to limit that instance's access on the share to only its own portion of it.
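As a rough illustration of this mode of operation, the sketch below shows the commander writing a job file onto the shared mount and polling for a result file written by the runner. The mount point, directory layout, and function names are assumptions for illustration only, not the actual cyhy-commander or cyhy-runner interfaces.

```python
# Hypothetical sketch: job hand-off over a shared EFS mount instead of SSH.
# The mount point and directory layout below are assumptions, not the real
# cyhy-commander/cyhy-runner conventions.
import json
import time
from pathlib import Path

EFS_ROOT = Path("/var/cyhy/work")  # assumed EFS mount point on the commander


def submit_job(scanner: str, job_id: str, payload: dict) -> Path:
    """Write a job file into the scanner's queue directory on the EFS share."""
    queue_dir = EFS_ROOT / scanner / "queue"
    queue_dir.mkdir(parents=True, exist_ok=True)
    job_path = queue_dir / f"{job_id}.json"
    job_path.write_text(json.dumps(payload))
    return job_path


def wait_for_result(scanner: str, job_id: str, timeout: float = 3600.0) -> dict:
    """Poll the scanner's results directory until the runner writes its output."""
    result_path = EFS_ROOT / scanner / "results" / f"{job_id}.json"
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if result_path.exists():
            return json.loads(result_path.read_text())
        time.sleep(5)
    raise TimeoutError(f"No result for job {job_id} from {scanner}")
```

In this sketch the cyhy-runner on each scanner would watch its own queue/ directory on the same share and write completed output to the corresponding results/ directory.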
> [!NOTE]
> We will need to watch during testing to ensure that the throughput to the EFS share is acceptable for this new mode of operation.
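One simple way to spot-check this during testing is to time a round trip of a reasonably sized file on the mounted share. This is purely illustrative; the mount point and payload size are assumptions, and proper evaluation should use the scanners' real workloads.

```python
# Hypothetical throughput spot check for the mounted EFS share.
# The mount point and payload size are assumptions for illustration.
import os
import time

MOUNT = "/var/cyhy/work"      # assumed EFS mount point
SIZE = 64 * 1024 * 1024       # 64 MiB test payload

path = os.path.join(MOUNT, "throughput-test.bin")
data = os.urandom(SIZE)

start = time.monotonic()
with open(path, "wb") as f:
    f.write(data)
    f.flush()
    os.fsync(f.fileno())  # force the write out to the share
write_secs = time.monotonic() - start

start = time.monotonic()
with open(path, "rb") as f:
    f.read()  # note: may be served from the page cache, flattering the read figure
read_secs = time.monotonic() - start

os.remove(path)
print(f"write: {SIZE / write_secs / 1e6:.1f} MB/s, read: {SIZE / read_secs / 1e6:.1f} MB/s")
```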
Acceptance criteria
How do we know when this work is done?
- [ ] The cyhy-commander is updated to use the local filesystem to issue work and retrieve results.
  - [ ] For completeness, this should still use an EFS access point to at least enforce the uid/gid of files written.
- [ ] The CyHy environment is updated to create an EFS share that is mounted on the instance running the cyhy-commander as well as on each scanner instance.
- [ ] Each scanner instance uses an EFS access point to control its access to the share (see the sketch after this list).
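As a point of reference for the access-point items above, the boto3 sketch below creates one EFS access point that pins the POSIX identity and restricts an instance to its own directory on the share. The environment may well provision this with infrastructure-as-code instead, and the filesystem ID, uid/gid, and path are placeholders.

```python
# Hypothetical sketch: one EFS access point per scanner instance that enforces
# uid/gid and limits the instance to its own subtree of the share.
# The filesystem ID, uid/gid, and path below are placeholders.
import boto3

efs = boto3.client("efs")

response = efs.create_access_point(
    FileSystemId="fs-0123456789abcdef0",   # placeholder EFS filesystem ID
    PosixUser={"Uid": 1001, "Gid": 1001},  # identity enforced for all access via this access point
    RootDirectory={
        "Path": "/scanner-01",             # the instance sees only this subtree
        "CreationInfo": {
            "OwnerUid": 1001,
            "OwnerGid": 1001,
            "Permissions": "0750",
        },
    },
    Tags=[{"Key": "Name", "Value": "cyhy-scanner-01"}],
)
print(response["AccessPointId"])
```

The instance would then mount the share through its access point (for example with the amazon-efs-utils mount helper), so all file operations are performed as the specified uid/gid within the restricted path.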
TODO
Create issues in the respective repositories to reflect the specific work to be done.