Closed yasergey9 closed 2 years ago
I have been using Ubuntu 20.04 LTS on Oracle VM VirtualBox 6.1.
Also, I followed the tutorial at https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/binnededisp_tutorial.html and got the result -log(Likelihood) < 0.
Can you follow the analysis for 3C 279 as in https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/binnededisp_tutorial.html? If you get the same results, then the issue might be a convergence problem in your model. The following link discusses how to deal with convergence in python: https://fermi.gsfc.nasa.gov/ssc/data/p7rep/analysis/scitools/python_tutorial.html
Thank you for your answer. I also worked with the 3C 279 data as in https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/binnededisp_tutorial.html and got the same error.
Can you attach the 3C 279 results?
I think this is the same issue as reported in Issue 88, which is a plotting bug that @dagorym was working on last year. Unfortunately, I don't think he's had much time to work on it since then. I believe the plotting was working when using the python interface. You might want to look at the Python Likelihood tutorial.
Can you attach the 3C 279 results?
Yes, but in this case I have got this error:
>>> print(likeobj.getRetCode())
156

So NewMinuit didn't converge. Likelihood_problem.txt
One option is to freeze some of the surrounding sources in the XML file. You can do that with some of the flags of the make4FGLxml.py script (please read the python file for the options). For instance, -r will freeze all sources beyond a certain radius in degrees, and -s will freeze sources below a certain significance. Alternatively, as mentioned before, you can refit inside python after a 156 result, as explained at the following link: https://fermi.gsfc.nasa.gov/ssc/data/p7rep/analysis/scitools/python_tutorial.html
Thank you, I will try to do these steps.
Hello, I was doing this tutorial https://fermi.gsfc.nasa.gov/ssc/data/p7rep/analysis/scitools/python_tutorial.html, but when I started using print() to display sources with TS < 0, I got an error:

>>> for source,TS in sourceDetails.iteritems():
...     print(source,TS)
...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'dict' object has no attribute 'iteritems'

What could be wrong with this attribute?

This particular error (the "'dict' object has no attribute 'iteritems'" part) looks like a Python 2 to Python 3 conversion error. I've seen that error before; it looks like the automatic conversion tool I used missed that one. I'll see if I can track it down from the information you posted, but a full stack trace would be helpful. Thanks.
Thanks, I changed the attribute 'iteritems' to 'items'. Now everything is fine.
Actually, looking at it more, it's not a code error but an error in the tutorial that didn't get updated. The correct command should be
for source,TS in sourceDetails.items():
instead of
for source,TS in sourceDetails.iteritems():
The iteritems() method was removed in Python 3; you use items() instead. We need to update the documentation on that.
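A minimal self-contained illustration of this change (plain dict, no Fermitools needed):

```python
# In Python 3, dicts expose items(); the Python 2 method iteritems()
# no longer exists and raises AttributeError.
sourceDetails = {"src_A": 12.3, "src_B": 55.1}

for source, TS in sourceDetails.items():   # Python 3 spelling
    print(source, TS)

try:
    sourceDetails.iteritems()              # Python 2 spelling
except AttributeError as err:
    print(err)                             # 'dict' object has no attribute 'iteritems'
```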
Thank you!
Hello, I tried to get rid of the problem in the binned analysis (python):

>>> print(likeobj.getRetCode())
156

using the tutorial https://fermi.gsfc.nasa.gov/ssc/data/p7rep/analysis/scitools/python_tutorial.html. I removed all sources with TS < 25 and set

>>> like.tol = 0.0001

But I still get

>>> print(likeobj.getRetCode())
156

Please tell me what the problem is. no_convergence.txt
Unfortunately, each fit is different, and one has to keep iterating until convergence. You might try removing all sources with TS < 40 and fitting again.
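As a concrete illustration of the pruning step, here is a plain-Python sketch; the TS values below are invented for illustration, and in a real session they would come from calling like.Ts(name) for each name in like.sourceNames():

```python
# Hypothetical TS values for sources in the ROI (illustrative only).
source_ts = {
    "target":   2500.0,  # well above the cut: keep
    "bright_1":  310.0,  # keep
    "weak_1":     12.3,  # below TS < 40: remove
    "weak_2":     -0.4,  # negative TS: remove
}

threshold = 40.0
to_delete = sorted(name for name, ts in source_ts.items() if ts < threshold)
print(to_delete)  # ['weak_1', 'weak_2']

# In pyLikelihood one would then call like.deleteSource(name) for each of
# these names and rerun like.fit(...) until getRetCode() returns 0.
```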
Ok, thanks.
Hello! I have successfully performed the 3C 279 binned analysis. But when I started the binned analysis of 4FGL J1836.2+5925 I got the result log(Likelihood) < 0. I didn't find any information on how to deal with this at https://fermi.gsfc.nasa.gov/ssc/data/p7rep/analysis/scitools/python_tutorial.html. I tried successively removing sources with TS < 100.0 and TS < 15000.0, but it didn't help. Please tell me how to solve this problem.
I have run a quick analysis for your source following the binned analysis as explained here, without any issues: https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/binned_likelihood_tutorial.html The only change is that I used

python make4FGLxml.py gll_psc_v27.fit j1835_binned_gti.fits -o j1835_input_model.xml -r 1.0

You can try increasing the -r radius gradually to see how the results vary.
I did a similar thing, but got something weird again.
Could this be because I'm analyzing data over a large time range, from START (239557417) to 491809456 in MET (about 8 years)?
If you are using a 15-degree radius, there is a bright source in the corner of the ROI that might be causing convergence issues. Could you try a 10-degree radius ROI? Also, please download the XML version of the catalog here:
https://fermi.gsfc.nasa.gov/ssc/data/access/lat/10yr_catalog/gll_psc_v26.xml
You can create the XML input file with

python make4FGLxml.py gll_psc_v26.xml j1835_binned_gti.fits -o j1835_input.xml -r 1.0

In the input XML file, change the Prefactor scale to 1E-9. Then rerun gtsrcmaps and gtlike. Please let us know if that works.
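For reference, a sketch of what that edit looks like in the model XML. The element layout follows the standard Fermitools source-model format, and the parameter values here are illustrative placeholders (the numbers in your own file will differ); the key change is the scale attribute of the Prefactor parameter:

```xml
<source name="4FGL J1836.2+5925" type="PointSource">
  <spectrum type="PLSuperExpCutoff2">
    <!-- scale set to 1e-09; the fitted "value" is then in units of 1e-9 -->
    <parameter free="1" name="Prefactor" scale="1e-09" value="3.8"   min="0"     max="100"/>
    <parameter free="1" name="Index1"    scale="-1.0"  value="0.93"  min="0"     max="3"/>
    <parameter free="0" name="Scale"     scale="1.0"   value="98.17" min="57.91" max="1273"/>
    <parameter free="1" name="Expfactor" scale="1e-03" value="13.1"  min="-1"    max="100"/>
    <parameter free="1" name="Index2"    scale="1.0"   value="0.66"  min="0"     max="2"/>
  </spectrum>
  <!-- spatialModel element unchanged, omitted here -->
</source>
```

With the scale chosen this way, the free parameter the optimizer sees is of order unity, which generally helps convergence.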
Hello! Should I change the Prefactor scale to 1E-9 for all sources with type "PowerLaw" and "PLSuperExpCutoff2"?
Only for J1836.2+5925
ok
So, the result is new.txt: -log(Likelihood) < 0. I used ROI = 10 and Prefactor scale (J1836.2+5925) = 1e-9.
Please try with the complete XML with all neighboring sources:

python make4FGLxml.py gll_psc_v26.xml j1835_binned_gti.fits -o j1835_input.xml

Change the Prefactor scale for J1836.2+5925 in the XML to 1E-9, then rerun gtsrcmaps and gtlike.
new_1.txt: -log(Likelihood) < 0
From the very beginning, the one problem that concerned me was convergence with NewMinuit. Although not as common, log(Likelihood) can be negative. I have now downloaded your data (15-degree ROI) and chosen the simplest model:

python make4FGLxml.py gll_psc_v26.xml j1835_binned_gti.fits -o j1835_input.xml -r 1.0

I changed the Prefactor scale for 4FGL J1836.2+5925 to 1E-9, then ran the fit with Minuit in python as follows:
>>> import pyLikelihood
>>> from BinnedAnalysis import *
>>> obs = BinnedObs(srcMaps='j1835long_binned_srcmaps.fits', binnedExpMap='j1835long_binned_expcube.fits', expCube='j1835long_binned_ltcube.fits', irfs='CALDB')
>>> like1 = BinnedAnalysis(obs, 'j1835long_input_model.xml', optimizer='Minuit')
>>> like1.tol = 0.1
>>> like1obj = pyLike.Minuit(like1.logLike)
>>> like1.fit(verbosity=0, covar=True, optObject=like1obj)
-1109422.9208761556
>>> like1obj.getQuality()
3
>>> like1.model['4FGL J1836.2+5925']
4FGL J1836.2+5925
Spectrum: PLSuperExpCutoff2
465    Prefactor:  3.766e+00  6.641e-02  0.000e+00  1.000e+02 ( 1.000e-09)
466       Index1:  9.347e-01  1.943e-02  0.000e+00  3.000e+00 (-1.000e+00)
467        Scale:  9.817e+01  0.000e+00  5.791e+01  1.273e+03 ( 1.000e+00) fixed
468    Expfactor:  1.311e+01  1.749e+00 -1.000e+00  1.000e+02 ( 1.000e-03)
469       Index2:  6.617e-01  1.302e-02  0.000e+00  2.000e+00 ( 1.000e+00)
A fit quality of 3 corresponds to proper convergence in Minuit. I then ran with NewMinuit:
>>> like2 = BinnedAnalysis(obs, 'j1835long_input_model.xml', optimizer='NewMinuit')
>>> like2obj = pyLike.NewMinuit(like2.logLike)
>>> like2.fit(verbosity=0, covar=True, optObject=like2obj)
-1109422.9773595822
>>> like2.model['4FGL J1836.2+5925']
4FGL J1836.2+5925
Spectrum: PLSuperExpCutoff2
465    Prefactor:  3.799e+00  8.017e-02  0.000e+00  1.000e+02 ( 1.000e-09)
466       Index1:  9.246e-01  2.259e-02  0.000e+00  3.000e+00 (-1.000e+00)
467        Scale:  9.817e+01  0.000e+00  5.791e+01  1.273e+03 ( 1.000e+00) fixed
468    Expfactor:  1.414e+01  2.230e+00 -1.000e+00  1.000e+02 ( 1.000e-03)
469       Index2:  6.543e-01  1.535e-02  0.000e+00  2.000e+00 ( 1.000e+00)
>>> like2obj.getRetCode()
0
This seems to converge properly according to NewMinuit (a return code of 0 means the fit converged). Please let me know if you get things to converge.
But I still get -log(Likelihood) < 0.
As indicated in the gtlike documentation, the maximum likelihood method is based on Mattox J. R. et al 1996, ApJ 461, 396. The maximum likelihood function is: lnL = sum_ijk ( n_ijk ln(m_ijk) - m_ijk ), without the factorial term. So -ln L can be negative.
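A quick numeric check of this point (plain Python, no Fermitools needed; the bin counts are made up): once the factorial term is dropped, any high-count bin where n ln(m) exceeds m makes lnL positive, so the reported -log(Likelihood) comes out negative.

```python
import math

# Poisson log-likelihood without the factorial term (Mattox et al. 1996):
# lnL = sum over bins of ( n * ln(m) - m )
def lnL(counts, model):
    return sum(n * math.log(m) - m for n, m in zip(counts, model))

# A single well-fit, high-count bin (n = m = 1000, illustrative numbers):
value = lnL([1000], [1000.0])
print(value > 0)    # True: 1000*ln(1000) - 1000 is about 5907.8
print(-value < 0)   # True: so gtlike's -log(Likelihood) is negative here
```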
If it's not difficult, could you please send me a link to this documentation?
You can access the documentation pages of all the Fermi tools at: https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/references.html
thank you
Hello, this is the code for building the (counts - model)/model plot (unbinned):

>>> resid = (like._Nobs() - sum_counts)/sum_counts
>>> resid_err = np.sqrt(like._Nobs())/sum_counts
>>> plt.figure(figsize=(9,9))
>>> plt.xscale('log')
>>> plt.errorbar(E, resid, yerr=resid_err, fmt='o')
>>> plt.axhline(0.0, ls=':')
>>> plt.show()

But the _Nobs() method and the sum_counts variable are not defined for the binned analysis. What can I use instead? https://fermi.gsfc.nasa.gov/ssc/data/p7rep/analysis/scitools/python_tutorial.html (Run the Likelihood Analysis) P.S. I changed sum_counts to sum_model. That helped.
The p7rep link is out of date; please use the following: https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/python_tutorial.html
The plot example should work the same for unbinned and binned analyses.
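The residual computation itself boils down to a few numpy lines. The count arrays below are made-up placeholders; in a binned session the observed counts come from the BinnedAnalysis object and the model counts from summing the per-source model spectra.

```python
import numpy as np

# Made-up per-bin observed counts and summed model counts (illustrative).
nobs      = np.array([120.0, 80.0, 45.0, 20.0])
sum_model = np.array([100.0, 90.0, 40.0, 25.0])

# Fractional residuals (counts - model)/model with Poisson error bars.
resid     = (nobs - sum_model) / sum_model
resid_err = np.sqrt(nobs) / sum_model

print(resid)  # 0.2, -0.111..., 0.125, -0.2
# To plot: plt.errorbar(E, resid, yerr=resid_err, fmt='o') as in the snippet.
```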
AttributeError: 'BinnedAnalysis' object has no attribute 'Nobs' [AttributeError.txt](https://github.com/fermi-lat/Likelihood/files/6243337/AttributeError_.txt)
P.S. Maybe I can use 'nobs'?
And another problem when building a TS map: TSMap_RuntimeErr.txt
Thanks for catching that. There was a change of attributes that was not reflected there. The most current version should be live now at
https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/python_tutorial.html
As for the TS map: for a binned analysis, the exposure map (expmap) needs to be created with gtexpcube2, as explained in the "Compute exposure map" section here:
https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/binned_likelihood_tutorial.html
Since these questions are more about usage than software issues, please email us directly at fermihelp@milkyway.gsfc.nasa.gov
Ok, Thanks
Hello, I have problems with the binned analysis. I run "gtlike refit=yes plot=no" and at the end I see -log(Likelihood): -9873275.449, i.e. a value less than zero. If I run "gtlike refit=yes plot=yes", I also see -log(Likelihood): -9873275.449, followed by "Floating point exception (core dumped)". Can you please help me with this problem? (Sorry for my bad English) floating_poinnt_exc.txt