A2P2 closed this pull request 5 months ago.
Sorry for the late response, @A2P2! Somehow I thought this PR was WIP. I'll review the tutorial this week.
@fehiepsi No problem, take your time. I was trying to figure out why `make docs` failed, but got a bit stuck with pandoc.
Thanks for the feedback. I've incorporated all points. Somehow the r-hats got higher than before, and the committed ipynb has a JSON error. I'll fix it and ping you.
@fehiepsi I've incorporated your feedback points; the outputs are saved in the notebook now as well. A bit too many commits, because I didn't know that one shouldn't use black formatting for ipynb files.
A side question: I've checked the notebook locally and in Colab, and the parameter estimation is quite bad locally in my case. See the screenshots below. Do you know why? Is it due to different jaxlib/Python versions?
Colab: Python 3.10.12, numpyro 0.13.2, jax 0.4.23, jaxlib 0.4.23+cuda12.cudnn89
Locally: Python 3.9.5, numpyro 0.13.2, jax 0.4.24, jaxlib 0.4.24
Hi @A2P2, I'm not sure why. The ESS looks good on my system, which has jax 0.4.21. However, it seems to me that Colab gives more reasonable results.
@A2P2 I got good results (in terms of ESS and estimated mean) with `max_tree_depth=10`.
@fehiepsi Glad it works for you. I uploaded the Colab version since it made more sense to me.
Could you set `max_tree_depth` to 10? I'm seeing that it is the main reason for the low ESS in Colab.
@fehiepsi With tree depth 10 the estimation is fairly slow. The reason I set it to 4 was to speed up the estimation a bit while sacrificing some quality.
I would maybe leave it at 4 and add a note saying that it should be increased for better estimation quality. What do you think?
@A2P2 The following settings seem to work very well (and are not slow) on my system:

```python
n_datasets = 3

odeint_with_kwargs = functools.partial(
    odeint, rtol=1e-6, atol=1e-5, mxstep=1000
)

max_tree_depth = 10
```

Using `n_datasets=3` also makes the plots clearer.
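To illustrate what these solver settings do, here is a self-contained sketch of the `functools.partial` pattern with `jax.experimental.ode.odeint`, applied to a toy exponential-decay ODE (the right-hand side and values below are illustrative, not from the tutorial):

```python
import functools

import jax.numpy as jnp
from jax.experimental.ode import odeint

# Tighter rtol/atol trade speed for accuracy; mxstep bounds the number of
# internal solver steps so stiff regions fail loudly instead of hanging.
odeint_with_kwargs = functools.partial(
    odeint, rtol=1e-6, atol=1e-5, mxstep=1000
)

# Toy ODE: dy/dt = -y with y(0) = 1, whose exact solution is exp(-t).
def f(y, t):
    return -y

ts = jnp.linspace(0.0, 1.0, 5)
ys = odeint_with_kwargs(f, jnp.array(1.0), ts)
```

The partial keeps the solver configuration in one place, so the model code can call `odeint_with_kwargs` everywhere and a single edit changes the tolerances for the whole notebook.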
Hi @A2P2 - We are going to make a new release soon. As discussed above, the content looks great. We just need to adjust a couple of settings to get more consistent results and better visualizations. Do you want to incorporate those small changes before the release?
My bad, thanks for reminding me. Will do today or tomorrow.
@fehiepsi I've incorporated your latest suggestions. Thanks for finding better ODE solver parameters; I forgot that I had tuned them to such small values. Let me know if there is anything else.
@A2P2 Could you run `make format` to pass the lint checks? It seems that there are many unused imports.
@fehiepsi Formatted now. I had an old version of the Makefile that still used black; it's formatted with ruff now.
This is the tutorial mentioned in https://github.com/pyro-ppl/numpyro/issues/1450. It extends the existing example https://num.pyro.ai/en/stable/examples/ode.html by including integration of multiple initial conditions and treating different data imperfections. Please suggest what to fix/improve.