giacomomagni closed this 1 year ago
Hi @LucaMantani (and @tgiani).
In the end I think we can replace MultiNest with this Python package called UltraNest:
https://johannesbuchner.github.io/UltraNest/index.html
I ran a benchmark of the global smefit2.0 NLO HO fit; it took 3.3 hours with the standard settings and 32 cores. The outcome is attached below.
I recall MultiNest was a bit faster, but given how difficult it is to install, I would move to this new package. Do you agree? If yes, I'll proceed to clean up this PR and remove MultiNest.
PS: the benchmark was also successful at linear level.
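For anyone curious what the swap looks like on the code side, here is a minimal sketch of how UltraNest is typically driven; the toy Gaussian likelihood, flat prior and coefficient names are placeholders, not the actual smefit chi2.

```python
# Minimal UltraNest sketch with a toy likelihood (illustrative only).
import numpy as np
from ultranest import ReactiveNestedSampler

param_names = ["ctG", "cpt"]                       # hypothetical coefficient labels
bounds = np.array([[-2.0, 2.0], [-10.0, 10.0]])    # made-up flat prior ranges

def prior_transform(cube):
    # Map the unit hypercube onto the flat prior volume.
    return bounds[:, 0] + cube * (bounds[:, 1] - bounds[:, 0])

def log_likelihood(params):
    # Placeholder Gaussian; in smefit this would be -0.5 * chi2(params).
    return -0.5 * np.sum(params ** 2)

sampler = ReactiveNestedSampler(param_names, log_likelihood, prior_transform)
result = sampler.run(min_num_live_points=400, frac_remain=0.01)
sampler.print_results()
```

If I remember correctly, the equally weighted posterior samples end up in `result["samples"]`, which is what would get histogrammed for the coefficient plots.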
I think @giacomomagni that the benefits of a self-contained package installation largely outweigh a slightly worse performance, so I am all for it!
And I think potential new users will also appreciate this development ;)
Hi Giacomo, thank you for this; it looks good to me and I think we should go for it.
I don't know this package: is there a reason why you moved away from dynesty? You said that it was running faster.
I am reading in the FAQ that the author claims other packages have some biases, while this one gives more faithful uncertainties; to do so it evaluates the likelihood more often and is therefore in general slower. So I think the slower performance is indeed expected.
Did you notice any difference in the results? Are the posteriors smoother or something like that?
Well, the author is the same as MultiNest's (i.e. Johannes Buchner).
The main reason to drop dynesty is that I was not able to reproduce the old results, and I think it has some intrinsically different features:
https://dynesty.readthedocs.io/en/latest/faq.html
From what I saw, I'm not sure that dynesty would be faster for a global fit with quadratics.
The only downside of UltraNest is that it still relies on MPI, so to run in parallel you will need to have conda around or your own installation of mpicc. Given that, we can still deploy on PyPI if a user just needs to run on a single core (or already has mpicc installed).
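In case it helps, here is a rough sketch of what running in parallel looks like; as far as I understand UltraNest picks up mpi4py automatically, so the same script works serially or under mpiexec (script name, core count and toy likelihood are just examples):

```python
# run_fit.py -- hypothetical driver; launch in parallel with e.g.
#   mpiexec -n 32 python run_fit.py
# UltraNest detects mpi4py automatically and spreads likelihood evaluations
# over the MPI ranks; started without mpiexec it simply runs on one core.
import numpy as np
from mpi4py import MPI                      # needs an MPI installation (mpicc)
from ultranest import ReactiveNestedSampler

def prior_transform(cube):
    return 20.0 * cube - 10.0               # toy flat prior on [-10, 10]

def log_likelihood(params):
    return -0.5 * np.sum(params ** 2)       # placeholder for -0.5 * chi2

sampler = ReactiveNestedSampler(["c1", "c2"], log_likelihood, prior_transform)
result = sampler.run()

if MPI.COMM_WORLD.Get_rank() == 0:          # write reports/plots from one rank only
    sampler.print_results()
```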
> Did you notice any difference in the results? Are the posteriors smoother or something like that?

Yes, the posteriors are a bit smoother; you can see the file attached above.
Ok, thanks! Interesting effect in cpt: with UltraNest the two maxima are on the same footing, while with MultiNest one is considered more important than the other!
Anyway, if you are happy with UltraNest and it makes installations easier, I'm all for it!
Merging #51 (ccd7218) into main (42a6cd9) will decrease coverage by 0.12%. The diff coverage is 46.66%.
@@ Coverage Diff @@
## main #51 +/- ##
==========================================
- Coverage 43.39% 43.27% -0.12%
==========================================
Files 27 27
Lines 2187 2216 +29
==========================================
+ Hits 949 959 +10
- Misses 1238 1257 +19
Flag | Coverage Δ |
---|---|
unittests | 43.27% <46.66%> (-0.12%) :arrow_down: |

Flags with carried forward coverage won't be shown.

Files | Coverage Δ |
---|---|
src/smefit/optimize/__init__.py | 61.66% <77.77%> (+0.55%) :arrow_up: |
src/smefit/cli/__init__.py | 0.00% <0.00%> (ø) |
src/smefit/runner.py | 46.78% <27.77%> (-0.79%) :arrow_down: |
src/smefit/optimize/ultranest.py | 62.37% <62.16%> (ø) |
@tgiani this PR should be ready for review.
As said in the README, it will be possible to install just with pip, but not if you want to run in parallel mode.
For that reason I'd keep the conda envs. MultiNest should not be needed anymore.
Hi @giacomomagni, I'm looking at this, but the installation script does not work anymore for me. Trying to run it on stoomboot I get the following:
Package operations: 42 installs, 14 updates, 0 removals
• Updating numpy (1.25.1 /home/conda/feedstock_root/build_artifacts/numpy_1688887056611/work -> 1.24.4): Failed
CalledProcessError
Command '['/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/bin/python', '-m', 'pip', 'uninstall', 'numpy', '-y']' returned non-zero exit status 2.
at /data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/subprocess.py:524 in run
520│ # We don't call process.wait() as .__exit__ does that for us.
521│ raise
522│ retcode = process.poll()
523│ if check and retcode:
→ 524│ raise CalledProcessError(retcode, process.args,
525│ output=stdout, stderr=stderr)
526│ return CompletedProcess(process.args, retcode, stdout, stderr)
527│
528│
The following error occurred when trying to handle this error:
EnvCommandError
Command ['/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/bin/python', '-m', 'pip', 'uninstall', 'numpy', '-y'] errored with the following return code 2
Output:
Found existing installation: numpy 1.25.1
Uninstalling numpy-1.25.1:
Successfully uninstalled numpy-1.25.1
ERROR: Exception:
Traceback (most recent call last):
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_internal/cli/base_command.py", line 169, in exc_logging_wrapper
status = run_func(*args)
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_internal/commands/uninstall.py", line 110, in run
uninstall_pathset.commit()
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_internal/req/req_uninstall.py", line 432, in commit
self._moved_paths.commit()
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_internal/req/req_uninstall.py", line 278, in commit
save_dir.cleanup()
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_internal/utils/temp_dir.py", line 173, in cleanup
rmtree(self._path)
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_vendor/tenacity/__init__.py", line 291, in wrapped_f
return self(f, *args, **kw)
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_vendor/tenacity/__init__.py", line 381, in __call__
do = self.iter(retry_state=retry_state)
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_vendor/tenacity/__init__.py", line 327, in iter
raise retry_exc.reraise()
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_vendor/tenacity/__init__.py", line 160, in reraise
raise self.last_attempt.result()
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/concurrent/futures/_base.py", line 439, in result
return self.__get_result()
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/concurrent/futures/_base.py", line 391, in __get_result
raise self._exception
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_vendor/tenacity/__init__.py", line 384, in __call__
result = fn(*args, **kwargs)
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pip/_internal/utils/misc.py", line 130, in rmtree
shutil.rmtree(dir, ignore_errors=ignore_errors, onerror=rmtree_errorhandler)
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/shutil.py", line 722, in rmtree
_rmtree_safe_fd(fd, path, onerror)
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/shutil.py", line 655, in _rmtree_safe_fd
_rmtree_safe_fd(dirfd, fullname, onerror)
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/shutil.py", line 678, in _rmtree_safe_fd
onerror(os.unlink, fullname, sys.exc_info())
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/shutil.py", line 676, in _rmtree_safe_fd
os.unlink(entry.name, dir_fd=topfd)
OSError: [Errno 16] Device or resource busy: '.nfs000000007670e6af004f2de3'
at /data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/poetry/utils/env.py:1525 in _run
1521│ output = ""
1522│ else:
1523│ output = subprocess.check_output(cmd, stderr=stderr, env=env, **kwargs)
1524│ except CalledProcessError as e:
→ 1525│ raise EnvCommandError(e, input=input_)
1526│
1527│ return decode(output)
1528│
1529│ def execute(self, bin: str, *args: str, **kwargs: Any) -> int:
• Updating packaging (23.1 /home/conda/feedstock_root/build_artifacts/packaging_1681337016113/work -> 23.2)
• Updating setuptools (68.0.0 -> 68.2.2)
• Updating six (1.16.0 /home/conda/feedstock_root/build_artifacts/six_1620240208055/work -> 1.16.0)
• Updating tomli (2.0.1 /home/conda/feedstock_root/build_artifacts/tomli_1644342247877/work -> 2.0.1)
• Updating typing-extensions (4.7.1 /home/conda/feedstock_root/build_artifacts/typing_extensions_1688315532570/work -> 4.8.0)
Installing dependencies from lock file
Warning: poetry.lock is not consistent with pyproject.toml. You may be getting improper dependencies. Run `poetry lock [--no-update]` to fix it.
Package operations: 42 installs, 9 updates, 0 removals
• Installing numpy (1.24.4)
• Updating six (1.16.0 /home/conda/feedstock_root/build_artifacts/six_1620240208055/work -> 1.16.0)
• Installing contourpy (1.1.0)
• Installing cycler (0.11.0)
• Installing fonttools (4.41.0)
• Installing kiwisolver (1.4.4)
• Updating packaging (23.2 -> 23.1)
• Installing pillow (10.0.0)
• Installing pyparsing (3.0.9)
• Installing python-dateutil (2.8.2)
• Installing asttokens (2.2.1)
• Installing executing (1.2.0)
• Installing matplotlib (3.7.2)
• Installing parso (0.8.3)
• Updating ptyprocess (0.7.0 /home/conda/feedstock_root/build_artifacts/ptyprocess_1609419310487/work/dist/ptyprocess-0.7.0-py2.py3-none-any.whl -> 0.7.0)
• Installing pure-eval (0.2.2)
• Installing pyrepl (0.9.0)
• Installing pytz (2023.3)
• Installing traitlets (5.9.0)
• Installing wcwidth (0.2.6)
• Installing backcall (0.2.0)
• Updating colorama (0.4.6 /home/conda/feedstock_root/build_artifacts/colorama_1666700638685/work -> 0.4.6)
• Installing commonmark (0.9.1)
• Installing corner (2.2.1)
• Installing cython (0.29.36)
• Installing decorator (5.1.1)
• Updating distlib (0.3.6 /home/conda/feedstock_root/build_artifacts/distlib_1668356257807/work -> 0.3.6)
• Installing fancycompleter (0.9.1)
• Updating filelock (3.12.2 /home/conda/feedstock_root/build_artifacts/filelock_1686612785345/work -> 3.12.2)
• Installing jedi (0.18.2)
• Installing matplotlib-inline (0.1.6)
• Installing pandas (1.5.3)
• Updating pexpect (4.8.0 /home/conda/feedstock_root/build_artifacts/pexpect_1667297516076/work -> 4.8.0)
• Installing pickleshare (0.7.5)
• Updating platformdirs (3.8.1 /home/conda/feedstock_root/build_artifacts/platformdirs_1688739404342/work -> 3.8.1)
• Installing prompt-toolkit (3.0.39)
• Installing pygments (2.15.1)
• Installing scipy (1.10.1)
• Installing stack-data (0.6.2)
• Installing wmctrl (0.4)
• Installing asv (0.4.2): Preparing...
• Installing click (8.1.5)
• Installing cma (3.3.0)
• Installing devtools (0.10.0)
• Installing ipython (8.12.2)
• Installing pdbpp (0.10.3)
• Installing pyyaml (5.4.1): Failed
ChefBuildError
Backend subprocess exited when trying to invoke get_requires_for_build_wheel
/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
!!
********************************************************************************
The license_file parameter is deprecated, use license_files instead.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
********************************************************************************
!!
parsed = self.parsers.get(option_name, lambda x: x)(value)
running egg_info
• Installing click (8.1.5)
• Installing cma (3.3.0)
• Installing devtools (0.10.0)
• Installing ipython (8.12.2)
• Installing pdbpp (0.10.3)
• Installing pyyaml (5.4.1): Failed
ChefBuildError
Backend subprocess exited when trying to invoke get_requires_for_build_wheel
/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
!!
********************************************************************************
The license_file parameter is deprecated, use license_files instead.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
********************************************************************************
!!
parsed = self.parsers.get(option_name, lambda x: x)(value)
running egg_info
• Installing asv (0.4.2): Installing...
• Installing click (8.1.5)
• Installing cma (3.3.0)
• Installing devtools (0.10.0)
• Installing ipython (8.12.2)
• Installing pdbpp (0.10.3)
• Installing pyyaml (5.4.1): Failed
ChefBuildError
Backend subprocess exited when trying to invoke get_requires_for_build_wheel
/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
!!
********************************************************************************
The license_file parameter is deprecated, use license_files instead.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
********************************************************************************
!!
parsed = self.parsers.get(option_name, lambda x: x)(value)
running egg_info
• Installing click (8.1.5)
• Installing cma (3.3.0)
• Installing devtools (0.10.0)
• Installing ipython (8.12.2)
• Installing pdbpp (0.10.3)
• Installing pyyaml (5.4.1): Failed
ChefBuildError
Backend subprocess exited when trying to invoke get_requires_for_build_wheel
/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
!!
********************************************************************************
The license_file parameter is deprecated, use license_files instead.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
********************************************************************************
!!
parsed = self.parsers.get(option_name, lambda x: x)(value)
running egg_info
• Installing asv (0.4.2)
• Installing click (8.1.5)
• Installing cma (3.3.0)
• Installing devtools (0.10.0)
• Installing ipython (8.12.2)
• Installing pdbpp (0.10.3)
• Installing pyyaml (5.4.1): Failed
ChefBuildError
Backend subprocess exited when trying to invoke get_requires_for_build_wheel
/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
!!
********************************************************************************
The license_file parameter is deprecated, use license_files instead.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
********************************************************************************
!!
parsed = self.parsers.get(option_name, lambda x: x)(value)
running egg_info
writing lib3/PyYAML.egg-info/PKG-INFO
writing dependency_links to lib3/PyYAML.egg-info/dependency_links.txt
writing top-level names to lib3/PyYAML.egg-info/top_level.txt
Traceback (most recent call last):
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
main()
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 355, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=['wheel'])
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 325, in _get_build_requires
self.run_setup()
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 341, in run_setup
exec(code, locals())
File "<string>", line 271, in <module>
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/__init__.py", line 103, in setup
return distutils.core.setup(**attrs)
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 185, in setup
return run_commands(dist)
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
dist.run_commands()
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 969, in run_commands
self.run_command(cmd)
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/dist.py", line 989, in run_command
super().run_command(command)
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/command/egg_info.py", line 318, in run
self.find_sources()
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/command/egg_info.py", line 326, in find_sources
mm.run()
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/command/egg_info.py", line 548, in run
self.add_defaults()
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/command/egg_info.py", line 586, in add_defaults
sdist.add_defaults(self)
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/command/sdist.py", line 113, in add_defaults
super().add_defaults()
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/_distutils/command/sdist.py", line 251, in add_defaults
self._add_defaults_ext()
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/_distutils/command/sdist.py", line 336, in _add_defaults_ext
self.filelist.extend(build_ext.get_source_files())
File "<string>", line 201, in get_source_files
File "/tmp/tmptc5jdgsi/.venv/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 107, in __getattr__
raise AttributeError(attr)
AttributeError: cython_sources
at /data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/poetry/installation/chef.py:147 in _prepare
143│
144│ error = ChefBuildError("\n\n".join(message_parts))
145│
146│ if error is not None:
→ 147│ raise error from None
148│
149│ return path
150│
151│ def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:
Note: This error originates from the build backend, and is likely not a problem with poetry but with pyyaml (5.4.1) not supporting PEP 517 builds. You can verify this by running 'pip wheel --use-pep517 "pyyaml (==5.4.1)"'.
• Installing rich (11.2.0)
• Installing seaborn (0.11.2)
• Installing ultranest (3.6.1): Failed
ChefBuildError
Backend subprocess exited when trying to invoke get_requires_for_build_wheel
Error compiling Cython file:
------------------------------------------------------------
...
# only consider points in the same cluster
if clusterids[j] == clusterids[i]:
pair_dist = 0.0
for k in range(ndim):
pair_dist += (pts[i,k] - pts[j,k])**2
total_dist += pair_dist**0.5
^
------------------------------------------------------------
ultranest/mlfriends.pyx:180:16: Cannot assign type 'npy_double complex' to 'float_t'
Compiling ultranest/mlfriends.pyx because it changed.
Compiling ultranest/stepfuncs.pyx because it changed.
[1/2] Cythonizing ultranest/mlfriends.pyx
Traceback (most recent call last):
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
main()
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
File "/tmp/tmp65oyp7bm/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 355, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=['wheel'])
File "/tmp/tmp65oyp7bm/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 325, in _get_build_requires
self.run_setup()
File "/tmp/tmp65oyp7bm/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 507, in run_setup
super(_BuildMetaLegacyBackend, self).run_setup(setup_script=setup_script)
File "/tmp/tmp65oyp7bm/.venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 341, in run_setup
exec(code, locals())
File "<string>", line 58, in <module>
File "/tmp/tmp65oyp7bm/.venv/lib/python3.10/site-packages/Cython/Build/Dependencies.py", line 1154, in cythonize
cythonize_one(*args)
File "/tmp/tmp65oyp7bm/.venv/lib/python3.10/site-packages/Cython/Build/Dependencies.py", line 1321, in cythonize_one
raise CompileError(None, pyx_file)
Cython.Compiler.Errors.CompileError: ultranest/mlfriends.pyx
at /data/theorie/tgiani/miniconda3/envs/smefit_ultranest/lib/python3.10/site-packages/poetry/installation/chef.py:147 in _prepare
143│
144│ error = ChefBuildError("\n\n".join(message_parts))
145│
146│ if error is not None:
→ 147│ raise error from None
148│
149│ return path
150│
151│ def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:
Note: This error originates from the build backend, and is likely not a problem with poetry but with ultranest (3.6.1) not supporting PEP 517 builds. You can verify this by running 'pip wheel --use-pep517 "ultranest (==3.6.1)"'.
• Updating virtualenv (20.23.1 /home/conda/feedstock_root/build_artifacts/virtualenv_1687005325630/work -> 20.23.1)
Installation was successful !!
To start type:
conda activate smefit_ultranest
smefit -h
Is the installation script working without any problem for you?
Some comments:
Thanks @tgiani for regenerating the conda files; most likely I did something wrong when doing it myself.
@giacomomagni mmh no, I think you did everything correctly, but then for some reason a different version of pyyaml was being installed following pyproject.toml, and this was giving problems; I'm not sure. Could you maybe try to install the code both on the cluster and locally and see if it works for you?
I've tried to run a smefit2.0 fit with quadratic corrections, asking for 32 cores, but it seems much slower. When you tested this branch it was taking just slightly longer than the old NS, right?
> I've tried to run a smefit2.0 fit with quadratic corrections, asking for 32 cores, but it seems much slower. When you tested this branch it was taking just slightly longer than the old NS, right?

Yes, how much slower was it for you?
@giacomomagni quite a lot, I think it took something like 8 hours. But maybe I've done something wrong. Which runcard have you used? To parallelise, it's just the same as with the old NS, no?
> @giacomomagni quite a lot, I think it took something like 8 hours. But maybe I've done something wrong. Which runcard have you used? To parallelise, it's just the same as with the old NS, no?

Yes, it should be the same... For me it took roughly 3 hours and 20 minutes with 32 cores. I used
Ok, thank you, let me try again.
Ok, I've reinstalled everything and tried again, and now with the runcard you gave it's taking around 3 hours, and the results look fine compared with the old NS.
@giacomomagni so I'm happy to merge if you're fine with that.
Hi @giacomomagni, what settings did you use in UltraNest to obtain those smooth histograms? I am getting very sharp peaks with the same plotting settings, see attached: coefficient_histo.pdf. My settings are:
nlive: 400
lepsilon: 0.05
target_evidence_unc: 0.5
target_post_unc: 0.5
frac_remain: 0.01
store_raw: false
Maybe try to use a higher nlive: 500 or nlive: 1000.
Here are some more details about each parameter, in case it helps.
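For reference, my guess at how those runcard keys map onto ReactiveNestedSampler.run; the correspondence is inferred from the names only, so it should be checked against src/smefit/optimize/ultranest.py:

```python
# Hypothetical mapping of the runcard keys above onto UltraNest run() kwargs;
# inferred from the names, so double-check against src/smefit/optimize/ultranest.py.
run_kwargs = dict(
    min_num_live_points=1000,  # "nlive": try 500 or 1000 for smoother posteriors
    Lepsilon=0.05,             # "lepsilon"
    dlogz=0.5,                 # "target_evidence_unc"
    dKL=0.5,                   # "target_post_unc"
    frac_remain=0.01,          # "frac_remain": larger values stop the run earlier
)
# result = sampler.run(**run_kwargs)   # with a sampler built as in the earlier sketches
```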
Thanks @giacomomagni. The point, it seems, is that I actually get too many samples, around 70k now with nlive: 500. I can adjust the way the histograms are plotted, but I am just curious which settings you used at the time; that might be easier, do you remember? I expect the option frac_remain might be higher in your setup - I tried with 0.5, but perhaps that's even too low.
> Thanks @giacomomagni. The point, it seems, is that I actually get too many samples, around 70k now with nlive: 500. I can adjust the way the histograms are plotted, but I am just curious which settings you used at the time; that might be easier, do you remember? I expect the option frac_remain might be higher in your setup - I tried with 0.5, but perhaps that's even too low.
I used the runcard that is posted above (clipped in the reply to Tommaso ...), but the settings look the same as yours except for nlive.
https://github.com/LHCfitNikhef/smefit_release/pull/51#issuecomment-1766596523
This PR is to replace MultiNest with dynesty (GitHub), as suggested by @LucaMantani. This way the installation will be much simpler, and we will finally be able to release on PyPI!! Moreover, this package is newer and better maintained, both in documentation and development.
https://dynesty.readthedocs.io/en/latest/index.html