Closed: andrewlee94 closed 1 year ago
What behavior would you expect if there are inconsistencies in the scaling factors specified at different levels?
Thanks Andrew. I will look into this. I need to figure out what is happening with the suffixes and how they propagate. Thanks for the example.
However, I will also point out that I would NOT use the scaling transformation - I would use the user scaling option in IPOPT directly. It uses the same suffixes. The scaling transformation is there if you want to perform scaling with a solver other than IPOPT. (Having said that, it should work, so I will look into it.)
@qtothec I had not considered inconsistencies at different levels, which is a big issue. The expected behaviour (from a naive user) would be to use the most recently specified scaling factor. That would be a nightmare to implement, however, so my first response would be to use the scaling factor defined at the same level as the component is declared.
@carldlaird I would agree that using IPOPT's scaling directly would be the best course, but for a general, inexperienced user we would like a common method that works for everything. Advanced users could then use IPOPT's scaling directly if they know how to do so (or any other solver they wish to use).
@carldlaird This is a bit of an aside, but... does that mean that if a user solves with IPOPT and had done the scaling transformation, the model would be scaled twice?
I don't recall if the scaling transformation removes the scaling suffixes or not. However, you also need to turn on an option to IPOPT to use the scaling, so you would need to explicitly ask for (1) a scaling transformation, and (2) for IPOPT to also use the scaling parameters.
I have been working with the Pyomo scaling transformation in preparation for implementing this within IDAES, and I have run into some unexpected behaviour when working with block-structured models. It appears that only scaling factors declared at the top level of the model are considered by the scaling transformation, and that it does not descend into the block tree to look for additional scaling factors.
I have tested a few cases to try to work out what was happening.
Case 1:
Returns:
ValueError: ScaleModel transformation called with scaling_method='user', but cannot find the suffix 'scaling_factor' on the model
This is possibly expected behaviour, but without the `scaling_factor` suffix present at the top level, the transformation returns an error. I do find it strange that the error is a `ValueError` - I would generally expect an `AttributeError` in the case where the attribute was missing completely.

Case 2:
Returns:
The model has been transformed, and the variable `a` replaced by `scaled_a`, but the value and bounds have not been scaled. It appears that the scaling factor(s) defined in child blocks are not being picked up by the transformation.

Case 3:
Returns:
This works now, and the variable value and bounds have been scaled.
It would be good if the scaling transformation would descend into the block structure to look for scaling factors, as it makes more sense to declare these at the same level as the component is declared. This also simplifies modular construction of the scaling factors, as each block can be responsible for its own scaling factors, without having to propagate these up to the top level of the model.