Right now the script deletes the tarball between submissions, even though it could often reuse the same tar. Not a major problem, but it makes submitting slow for bigger submissions and wastes gpvm resources building duplicate tarballs and moving them around. One option is to let the Python script submit multiple jobs in a single invocation, instead of having a bash script drive it one job at a time; a rough sketch follows.
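A minimal sketch of what tarball reuse plus multi-job submission could look like. This is an illustration under stated assumptions, not the current script: the helper names (make_tarball, submit_job), the --reuse-tar flag, and the default paths are all hypothetical, and the real submission command (e.g. jobsub_submit with its options) would replace the placeholder.

    import argparse
    import os
    import subprocess

    def make_tarball(source_dir: str, tar_path: str) -> str:
        """Build the tarball only if it does not already exist."""
        if not os.path.exists(tar_path):
            subprocess.run(["tar", "-czf", tar_path, "-C", source_dir, "."], check=True)
        return tar_path

    def submit_job(tar_path: str, job_cfg: str) -> None:
        """Placeholder for the real submission command."""
        print(f"submitting {job_cfg} with tarball {tar_path}")

    def main() -> None:
        parser = argparse.ArgumentParser()
        parser.add_argument("job_configs", nargs="+",
                            help="one or more job configs to submit in this invocation")
        parser.add_argument("--tarball", default="submission.tar.gz")
        parser.add_argument("--reuse-tar", action="store_true",
                            help="keep an existing tarball instead of rebuilding it")
        args = parser.parse_args()

        # Current behaviour: delete and rebuild the tarball every time.
        if not args.reuse_tar and os.path.exists(args.tarball):
            os.remove(args.tarball)

        # One tarball shared across all jobs in this submission.
        tar_path = make_tarball("build", args.tarball)
        for cfg in args.job_configs:
            submit_job(tar_path, cfg)

    if __name__ == "__main__":
        main()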
Also, maybe make the script aware of the environment, e.g. set up the ROOT version that was used to build the package (right now it is hard-coded to ROOT 6.22). It could simply read an environment variable set at setup time on the gpvm, as sketched below.
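A sketch of reading the ROOT version from the environment, assuming the gpvm setup script exports it. The variable name PACKAGE_ROOT_VERSION is hypothetical, and the real setup line would likely need the full UPS version string and qualifiers rather than the bare "6.22" used as a fallback here.

    import os

    # Hypothetical variable name; whatever the setup script actually exports
    # for the ROOT version used to build the package would go here.
    ROOT_VERSION_VAR = "PACKAGE_ROOT_VERSION"

    def get_root_version(default: str = "6.22") -> str:
        """Prefer the ROOT version recorded at setup time; fall back to the old hard-coded one."""
        return os.environ.get(ROOT_VERSION_VAR, default)

    def root_setup_line() -> str:
        """Line to write into the grid job script so it sets up the matching ROOT."""
        return f"setup root {get_root_version()}"

    if __name__ == "__main__":
        print(root_setup_line())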