AnthonyLim23 closed this issue 2 years ago
Below is a script for a prototype of a pseudo-global optimizer.
The algorithm has the correct logic and ideas. The user defines a maximum number of attempts for the loop. Each iteration does a full fit, then a second "probe" fit limited to 3 iterations. If the second fit escapes the local minimum, we start the loop again from where it landed; escape is detected by a percentage change in the fitted parameter values.
The final version will need to be written in C++ and inherit from the fitting algorithm, as it will need to work with multidomain fitting.
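In outline, the loop looks like this (a hedged sketch: `full_fit` and `probe_fit` are placeholder callables standing in for Mantid's `Fit`, and `significant_change` is the percentage test; none of this is the Mantid code itself):

```python
import numpy as np

def significant_change(params_a, params_b, tol):
    # True if any parameter moved by more than a fraction `tol` of its value.
    a = np.asarray(params_a, dtype=float)
    b = np.asarray(params_b, dtype=float)
    return bool(np.any(np.abs(a - b) / np.abs(a) > tol))

def pseudo_global_fit(full_fit, probe_fit, params, max_loops, tol):
    # full_fit: converge fully to the nearest minimum.
    # probe_fit: a few large steps that can climb out of a shallow minimum.
    for _ in range(max_loops):
        params = full_fit(params)
        probed = probe_fit(params)
        if not significant_change(params, probed, tol):
            return params      # the probe stayed put: accept this minimum
        params = probed        # the probe escaped: restart from where it landed
    return params
```

The key design point is that the probe fit is deliberately short, so it takes the larger early steps that can roll out of a shallow basin, while the full fit does the careful convergence.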
```python
import mantid
from mantid.simpleapi import *
import numpy as np
from mantid.api import *
from mantid.kernel import *


class pseudoGlobalFit(PythonAlgorithm):

    def summary(self):
        return "pseudo global fit"

    def category(self):
        return "Muon"

    def PyInit(self):
        self.declareProperty("MaxLoops", 10)
        self.declareProperty("tolerance", 1.0, Direction.Input)
        self.declareProperty("StartX", 0.0, Direction.Input)
        self.declareProperty("EndX", 15.0, Direction.Input)
        self.declareProperty(WorkspaceProperty("InputWorkspace", "",
                                               direction=Direction.Input))
        self.declareProperty(FunctionProperty("Function", direction=Direction.InOut), doc="")

    def significant_change(self, table1, table2, tol):
        # Compare fitted parameter values from the two parameter tables;
        # a fractional change above tol means the probe fit escaped the minimum.
        for j in range(len(table1.column(1))):
            if np.abs(table1.column(1)[j] - table2.column(1)[j]) / np.abs(table1.column(1)[j]) > tol:
                return True
        return False

    def doFit(self, function, ws, start, end, maxIterations=1000):
        return Fit(Function=function, MaxIterations=maxIterations, InputWorkspace=ws,
                   CreateOutput=True, StartX=start, EndX=end)

    def PyExec(self):
        function = self.getProperty("Function").value
        start = self.getProperty("StartX").value
        end = self.getProperty("EndX").value
        tol = self.getProperty("tolerance").value / 100.
        ws = self.getProperty("InputWorkspace").value
        for j in range(self.getProperty("MaxLoops").value):
            print("iteration", j, function)
            out_ws = self.doFit(function, ws, start, end)
            # Short probe fit: 3 iterations starting from the converged parameters.
            test_ws = self.doFit(out_ws.Function, ws, start, end, 3)
            if not self.significant_change(out_ws.OutputParameters, test_ws.OutputParameters, tol):
                return
            # The probe escaped the minimum: continue from its result.
            function = test_ws.Function


AlgorithmFactory.subscribe(pseudoGlobalFit)


def create_alg(name, loops, tol, start, end, func):
    alg = mantid.AlgorithmManager.create("pseudoGlobalFit")
    alg.initialize()
    alg.setAlwaysStoreInADS(True)
    alg.setProperty("MaxLoops", loops)
    alg.setProperty("tolerance", tol)
    alg.setProperty("StartX", start)
    alg.setProperty("EndX", end)
    alg.setProperty("InputWorkspace", name)
    alg.setProperty("Function", func)
    alg.execute()


create_alg('MUSR86009; Pair Asym; long; MA', 100, 0.1, 0.116, 15,
           'name=GausDecay,A=0.04912,Sigma=0.799829;name=FlatBackground,A0=0.242810')
```
The fitting in Mantid finds a local minimum when given a bad starting guess. This is not surprising: Mantid uses a local minimizer, and it found a local minimum.
However, if the user starts a second fit at this local minimum, it finds a close approximation to the global minimum. I suspect this is because the first fit was using small increments to check whether the minimisation could continue (it is sensible to use smaller steps as you approach a minimum). The second fit did not know it was at a local minimum and took a larger step that carried it out of the minimum. This tells me that the minimum is a shallow one.
Here is a script to demonstrate it; it also shows that scipy does just as badly.
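This behaviour is generic to local minimizers, not specific to Mantid. A minimal toy illustration (my own sketch, not the script below: plain gradient descent on a double-well function, with all names and values invented for the example) shows how the starting guess selects which minimum is found:

```python
import numpy as np

def f(x):
    # Double-well potential: minima near x = -1 and x = +1; the 0.3*x tilt
    # makes the left well the deeper (global) minimum.
    return (x**2 - 1.0)**2 + 0.3 * x

def grad(x):
    return 4.0 * x * (x**2 - 1.0) + 0.3

def descend(x, lr=0.01, steps=5000):
    # Plain gradient descent: a stand-in for any local minimizer.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

good = descend(-2.0)  # starts in the basin of the global minimum
bad = descend(2.0)    # starts in the basin of the shallow local minimum
print(good, f(good))
print(bad, f(bad))
```

Both runs converge (the gradient vanishes at both end points), but only the first reaches the global minimum; the second is trapped in the shallow well, exactly the behaviour seen with a bad starting guess in Mantid or scipy.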