Closed — gklebanov closed this issue 4 years ago
For the Estimation & Prediction modules, there is a check in the Utilities section so that the study package can only be downloaded once the minimum required inputs have been provided. We should apply the same check to the Executions section for Estimation & Prediction.
This issue is related to #1811, #1846, and #994 for Estimation since we'd like to have a series of checks that we can use to notify users of any obvious problems.
Looks like it is also related to #387
This issue bundles together validation for several modules in ATLAS:
For Estimation/Prediction, let's use #994 to track the required checks for generating/downloading the study package, as that list of checks may be lengthy. For the others, let me take a first pass at what should be required and, if easier, spin off separate issues to address each one:
cohort_inclusion table)

@gklebanov @anthonysena Errors are now shown in the "Messages" tab (as is done for cohort definitions). Do you agree?
Expected behavior
All ATLAS analyses (cohorts, characterization, IR, PLE, PLP, TxPathways) should highlight required parameters before allowing users to click the generate buttons. If a user has not entered everything, they should be guided to complete the design.
I would also propose a flag or check that returns the analysis status as "valid/invalid", so that it can be used by execution automation.
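As a minimal sketch of what that flag could look like (all function and field names here are hypothetical, not actual ATLAS code), the check could return the list of missing required parameters, which the UI can use both to disable the generate button and to expose a machine-readable valid/invalid status for automation:

```javascript
// Hypothetical sketch: validate an analysis design before enabling generation.
// The field names (targetCohortId, outcomeCohortIds, timeAtRisk) are illustrative only.
function validateDesign(design) {
  const errors = [];
  if (!design.targetCohortId) {
    errors.push("A target cohort must be selected.");
  }
  if (!design.outcomeCohortIds || design.outcomeCohortIds.length === 0) {
    errors.push("At least one outcome cohort is required.");
  }
  if (!design.timeAtRisk) {
    errors.push("A time-at-risk window must be defined.");
  }
  return {
    valid: errors.length === 0, // machine-readable status for execution automation
    errors,                     // human-readable messages, e.g. for a "Messages" tab
  };
}

// Usage: an incomplete design yields valid === false and the reasons why.
const result = validateDesign({ targetCohortId: 123, outcomeCohortIds: [], timeAtRisk: null });
console.log(result.valid);  // false
console.log(result.errors); // missing outcome cohorts, missing time-at-risk window
```

The same result object could back both the UI (disable the button, show the messages) and the WebAPI (refuse to queue an invalid execution), so the two can never disagree.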
Actual behavior
In many cases, the analyses allow users to click the generate button even when required parameters have not been completed. The execution then reports "completed" but has actually failed; this is not obvious, and the only way to see it is to decipher the logs. Exception handling should also be documented and made consistent across ATLAS (I will create a separate GitHub issue for that).
@pbr6cornell - this is causing a lot of headaches for many people today. I propose we target it for 2.8, but I will seek your confirmation since it will impact our scope.