IDAES / idaes-pse

The IDAES Process Systems Engineering Framework
https://idaes-pse.readthedocs.io/

Model Diagnostics Checklist #1222

Open andrewlee94 opened 1 year ago

andrewlee94 commented 1 year ago

Related to #1208

This issue is to track general steps and tasks related to the new DiagnosticsToolbox class.

First Steps:

Next Steps:

Possible extensions:

- [ ] Tool to detect linearly dependent equations?
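For illustration only, a minimal sketch of one way such a check could work: evaluate the constraint Jacobian with PyNumero and look for rank deficiency at the current point via SVD. The helper name and tolerance below are placeholders, and this assumes the PyNumero ASL library is available and the model has an active objective; it is a rough prototype, not anything that exists in the toolbox today.

```python
import numpy as np
from pyomo.contrib.pynumero.interfaces.pyomo_nlp import PyomoNLP


def report_dependent_constraints(model, tol=1e-8):
    # PyomoNLP requires an active objective on the model and the ASL interface
    nlp = PyomoNLP(model)

    # Dense copy of the constraint Jacobian at the current point
    # (fine for small/medium problems; avoid for very large flowsheets)
    jac = nlp.evaluate_jacobian().toarray()

    # Singular values near zero indicate (near-)linear dependence among the
    # constraint gradients at the current point
    sigma = np.linalg.svd(jac, compute_uv=False)
    n_dependent = int(np.sum(sigma <= tol * sigma.max()))

    print(f"Estimated number of (near-)dependent constraints: {n_dependent}")
    return n_dependent
```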

adam-a-a commented 10 months ago

@andrewlee94 I just tried using the new toolbox on a complex flowsheet (the "extended" BSM2 flowsheet involving the activated sludge and anaerobic digestion models that you formerly worked on in WaterTAP), and I was thinking that it would be a nice bonus to have a method that, after a failed solve, writes the IPOPT output as well as other useful information (especially constraint residuals) to a file for the user to review more easily. Right now, I am doing this by running

```python
results = solve_flowsheet(model)
dt.report_numerical_issues()
dt.display_constraints_with_large_residuals()
dt.display_variables_at_or_outside_bounds()
dt.display_variables_with_extreme_jacobians()
dt.display_constraints_with_extreme_jacobians()
print_infeasible_constraints(model)
```

The print_infeasible_constraints() function is from watertap.core.util.infeasible.model_diagnostics.infeasible, and I could technically write its output to a separate file (the function provides that capability). I thought it would be nice to collect a report of numerical issues, the IPOPT output, a list of infeasible constraints with residual values, and whatever else we think would be useful into a single external file so the user can (a) keep a record and (b) more easily sift through the report. Hopefully that made sense.
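In case it helps, one way to capture everything above in a single file today, without relying on each function's own file-writing options, is to redirect stdout around the whole block. A minimal sketch using the same names as the snippet above (the output file name is arbitrary):

```python
from contextlib import redirect_stdout

# Dump the whole diagnostic printout from the snippet above into one file
# so it can be kept as a record and reviewed after a failed solve
with open("diagnostics_report.txt", "w") as f, redirect_stdout(f):
    dt.report_numerical_issues()
    dt.display_constraints_with_large_residuals()
    dt.display_variables_at_or_outside_bounds()
    dt.display_variables_with_extreme_jacobians()
    dt.display_constraints_with_extreme_jacobians()
    print_infeasible_constraints(model)
```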

On a side note, I probably should double-check your implementation of the extreme Jacobian methods before mentioning this, but I think I am getting different results when using the new methods to list extreme Jacobian values compared with the existing methods that check extreme Jacobian rows and columns (i.e., one list of variables with extreme Jacobian values is longer than the other and contains different values). Additionally, when I get the initial report from report_numerical_issues(), I notice that the Warning and Caution sections list different numbers of extreme values to be wary of, e.g.:

*(screenshot of report_numerical_issues() output omitted)*

Ignoring the horrible results here (I set max_iter to 1 right before restoration), my question is: what is the difference between what is shown in the Warning section vs the Caution section? For example, the Warning section states there are 3 variables with extreme Jacobian values, but the Caution section states that there are 286 variables with extreme Jacobian values. Apologies in advance for the loaded post.

andrewlee94 commented 10 months ago

@adam-a-a I'll start with the Jacobians, as those are easy to address. The DiagnosticsToolbox has two sets of tolerances for the Jacobian, one for warnings and one for cautions, with the aim that a warning only triggers on really bad cases, while the cautions show a broader list of not-great-but-maybe-OK values. These tolerances are probably slightly different from the defaults used by the underlying methods, hence the different results.
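Roughly, the idea looks like this; note that the config option names below are written from memory and should be checked against the DiagnosticsToolbox docstring rather than taken as the exact API:

```python
from idaes.core.util.model_diagnostics import DiagnosticsToolbox

# Option names shown for illustration only. The warning thresholds are
# intentionally much more extreme than the caution thresholds, so the warning
# list is a (much shorter) subset of the caution list.
dt = DiagnosticsToolbox(
    model,
    jacobian_large_value_warning=1e8,   # triggers a Warning
    jacobian_large_value_caution=1e4,   # triggers only a Caution
    jacobian_small_value_warning=1e-8,
    jacobian_small_value_caution=1e-4,
)
dt.report_numerical_issues()
```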

Regarding printing to a file, all of the report and display methods take a stream argument, which can be any writable file-like object.
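For example, something along these lines should work for sending everything to a single file (the file name here is just an illustration):

```python
# Write the numerical issues report and the detailed displays to one file
# for later review; `stream` accepts any writable file-like object
with open("diagnostics_report.txt", "w") as f:
    dt.report_numerical_issues(stream=f)
    dt.display_constraints_with_large_residuals(stream=f)
    dt.display_variables_at_or_outside_bounds(stream=f)
    dt.display_variables_with_extreme_jacobians(stream=f)
    dt.display_constraints_with_extreme_jacobians(stream=f)
```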

adam-a-a commented 10 months ago

@andrewlee94 Thanks for the clarification. I wonder if we'd want to hint at that in the printout from report_numerical_issues (i.e., Warning = more severe, Caution = less severe, maybe with the tolerances considered for each). With a little thought one could logically deduce that this is the case, but it wasn't immediately apparent to me whether this was intentional or some sort of mistake in the method (again, I probably should have dug into the code to find out).

EDIT: I later noticed that the docstring does explain this already.

jsiirola commented 10 months ago

Seconding @adam-a-a's comment: it would be helpful if the Warnings / Cautions included the thresholds for each of the warnings; e.g.:

```text
Warning: 3 Variables with extreme Jacobian values (<1e-8 or >1e8)
...
Caution: 286 Variables with extreme Jacobian values (<1e-4 or >1e4)
```