Closed: AUTProgram closed this issue 1 year ago.
FWIW `initialise` is not a method that the user is ever intended to interact with directly -- it's only intended to be used internally, e.g. within `estimate_parameters`. @AUTProgram, if we renamed it to `_initialise` and updated the docstring to reflect that, would it help your confusion?

The other thing we could do is rename `initial_value` to `estimate` to make it clear that it's something used instead of finding a value in `estimate_parameters`, and then have an `_initialised_value` which users don't use, but is controlled within `_initialise`.
```python
class Model:
    def estimate_parameters(self, x, y, model_parameters):
        if model_parameters["param"].needs_initialising():
            model_parameters["param"].initialise(some_value)
```
(NB: if you write `python` after the opening triple backticks then it does syntax highlighting for you.)
The only intention of the current `initialise` method is to simplify this boilerplate within `estimate_parameters`.
So, how about:
```python
import dataclasses
from typing import Optional


@dataclasses.dataclass
class ModelParameter:
    fixed_to: Optional[float] = None
    estimated_value: Optional[float] = None
    _initialised_to: Optional[float] = dataclasses.field(init=False, default=None)

    def _initialise(self, default: float):
        if self.fixed_to is not None and self.estimated_value is not None:
            raise ValueError("Fixed parameters must not have estimated values")
        if self.fixed_to is not None:
            self._initialised_to = self.fixed_to
        elif self.estimated_value is not None:
            self._initialised_to = self.estimated_value
        else:
            self._initialised_to = default
        return self._initialised_to


class Model:
    def estimate_parameters(self, x, y, model_parameters):
        # some_value: placeholder for the model's heuristic estimate
        model_parameters["param"]._initialise(some_value)
```
@hartytp I believe this has been implemented in https://github.com/OxIonics/ionics_fits/pull/45, can be closed?
If you're happy with it, please do close your issue.
The current way of initialising parameters is described correctly in the docstrings for `ModelParameter.initialise` and `ModelParameter.get_initial_value`, but may feel unintuitive for users. I introduced a bug in one of my scripts because I assumed the wrong initial parameter value. For example, if one does
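(The example snippet appears to be missing from this copy of the issue; the following is a reconstructed sketch consistent with the behaviour described below, with the specific method argument assumed.)

```python
param = ModelParameter()
param.initialised_to = 0.9  # user-supplied initial value takes precedence
param.initialise(0.5)  # 0.5 is only a fallback, so it is ignored here
```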
then `param.initialised_to == 0.9` afterwards. So the argument passed to `initialise` does not necessarily have any effect on the value to which the parameter is initialised. Rather, it's just a fallback in case there's no other value that takes precedence. I think it's worth thinking about whether we can improve this part of the codebase.

Firstly, I find the word `initialise` already a bit ambiguous, because I would argue that a parameter that is fixed during a fit does not need to be assigned an initial value in the sense of "initial parameter estimate for curve fitting". So for parameters that are fixed during a fit (for which `fixed_to is not None`), it's questionable whether `param.initialise()` has the same meaning as for parameters that are floated. As far as I can tell, for fixed parameters the attribute `initialised_to` is not actually used by the `Fitter._fit` method anyway.

I believe conceptually there are only three states a parameter can be in:
- fixed to a certain value, in which case it does not need an initial value
- not fixed, and an initial value has been specified by the user
- not fixed, and no initial value has been specified by the user, in which case it still needs to be given an initial estimate by the model routine before fitting
I think one way of reflecting these options would be (leaving out the attributes and methods not relevant here):
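(The proposed snippet also appears to be missing from this copy; below is a reconstruction based on the surrounding description and the `needs_initialising` boilerplate quoted in the reply above, so the exact details are assumptions.)

```python
import dataclasses
from typing import Optional


@dataclasses.dataclass
class ModelParameter:
    fixed_to: Optional[float] = None
    initialised_to: Optional[float] = None

    def needs_initialising(self) -> bool:
        # only a floated parameter with no user-supplied value needs an estimate
        return self.fixed_to is None and self.initialised_to is None

    def initialise(self, value: float):
        # the value passed here is always the value the parameter ends up with
        self.initialised_to = value


class Model:
    def estimate_parameters(self, x, y, model_parameters):
        if model_parameters["param"].needs_initialising():
            # some_value: placeholder for the model's heuristic estimate
            model_parameters["param"].initialise(some_value)
```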
This would mean that if the user has set `param.fixed_to` or `param.initialised_to` before fitting, those values would just be used directly. If `initialise` is called on a parameter, the user can be certain that the value passed will be the value that `initialised_to` is set to afterwards. Note that a fixed parameter would have `self.initialised_to = None`, while a floated parameter would have `self.fixed_to = None`. However, a downside of this approach is that the `if model_parameters["param"].needs_initialising()` statement has to be repeated for every parameter, which I don't like (a sketch of that repetition follows below).

Not completely sure about the right way of doing things here, but I thought I'd put it out there to see what other people think.
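For concreteness, a hypothetical sketch of how that repetition would look for a model with more than one parameter (the parameter names and heuristic estimates are made up):

```python
class TwoParameterModel:
    def estimate_parameters(self, x, y, model_parameters):
        # the same guard has to be spelled out for each parameter
        if model_parameters["a"].needs_initialising():
            model_parameters["a"].initialise(1.0)  # placeholder heuristic
        if model_parameters["b"].needs_initialising():
            model_parameters["b"].initialise(0.0)  # placeholder heuristic
```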