Closed: kevinkhu closed this issue 1 year ago
Hi @kevinkhu and @rachelbf, have you solved the problem?
Hi @jvines, I have the same problem. I tried the example on the astroARIADNE website and also changed the prior (f.prior_setup); both failed. It runs for a while and then, after several minutes, stops and returns:
FITTING MODEL : phoenix
12169it [32:07, 6.31it/s, batch: 0 | bound: 154 | nc: 1 | ncall: 266017 | eff(%): 4.575 | loglstar: -inf < 355.627 < inf | logz: 332.782 +/- nan | dlogz: 0.001 > 0.500]
RemoteTraceback                           Traceback (most recent call last)
RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/usr/local/lib/python3.7/multiprocessing/pool.py", line 44, in mapstar
    return list(map(*args))
  File "/usr/local/lib/python3.7/site-packages/dynesty/sampling.py", line 189, in sample_rwalk
    loglstar, logl_prop, axes, scale))
RuntimeError: Random walk sampling appears to be stuck! Some useful output quantities:
u:      [0.39942317 0.69327559 ... 0.30352457]   (19 elements; middle values trimmed)
u_prop: [0.39942303 0.69327543 ... 0.30352437]
loglstar:  355.0710298630696
logl_prop: 352.6536813861734
scale:     1.8734013431785728e-06
(the drhat, dr, du vectors and the 19x19 axes matrix printed here are trimmed for readability)
"""
The above exception was the direct cause of the following exception:
RuntimeError                              Traceback (most recent call last)
/tmp/ipykernel_30053/3764429830.py in <module>
----> 1 f.fit_bma()

/usr/local/lib/python3.7/site-packages/astroARIADNE-0.5.0-py3.7.egg/astroARIADNE/fitter.py in fit_bma(self)
    599                 print('\t\t\tFITTING MODEL : ' + gr)
    600                 try:
--> 601                     self.fit_dynesty(out_file=out_file)
    602                 except ValueError as e:
    603                     dump_out = self.out_folder + '/' + gr + '_DUMP.pkl'

/usr/local/lib/python3.7/site-packages/astroARIADNE-0.5.0-py3.7.egg/astroARIADNE/fitter.py in fit_dynesty(self, out_file)
    727             self.sampler.run_nested(dlogz_init=self._dlogz,
    728                                     nlive_init=self._nlive,
--> 729                                     wt_kwargs={'pfrac': 1})
    730         else:
    731             self.sampler = dynesty.DynamicNestedSampler(

/usr/local/lib/python3.7/site-packages/dynesty/dynamicsampler.py in run_nested(self, nlive_init, maxiter_init, maxcall_init, dlogz_init, logl_max_init, n_effective_init, nlive_batch, wt_function, wt_kwargs, maxiter_batch, maxcall_batch, maxiter, maxcall, maxbatch, n_effective, stop_function, stop_kwargs, use_stop, save_bounds, print_progress, print_func, live_points)
   1668                                     print_progress=print_progress,
   1669                                     print_func=print_func,
-> 1670                                     stop_val=stop_val)
   1671                 ncall, niter, logl_bounds, results = passback
   1672             elif logl_bounds[1] != np.inf:

/usr/local/lib/python3.7/site-packages/dynesty/dynamicsampler.py in add_batch(self, nlive, wt_function, wt_kwargs, maxiter, maxcall, logl_bounds, save_bounds, print_progress, print_func, stop_val)
   1773                                      maxiter=maxiter,
   1774                                      maxcall=maxcall,
-> 1775                                      save_bounds=save_bounds):
   1776                 (worst, ustar, vstar, loglstar, nc,
   1777                  worst_it, boundidx, bounditer, eff) = results

/usr/local/lib/python3.7/site-packages/dynesty/dynamicsampler.py in sample_batch(self, nlive_new, update_interval, logl_bounds, maxiter, maxcall, save_bounds)
   1150         for i in range(nlive_new):
   1151             (live_u[i], live_v[i], live_logl[i],
-> 1152              live_nc[i]) = self.sampler._new_point(logl_min, math.log(vol))
   1153             live_it[i] = self.it
   1154             self.ncall += live_nc[i]

/usr/local/lib/python3.7/site-packages/dynesty/sampler.py in _new_point(self, loglstar, logvol)
    384         while True:
    385             # Get the next point from the queue
--> 386             u, v, logl, nc, blob = self._get_point_value(loglstar)
    387             ncall += nc
    388

/usr/local/lib/python3.7/site-packages/dynesty/sampler.py in _get_point_value(self, loglstar)
    368         # If the queue is empty, refill it.
    369         if self.nqueue <= 0:
--> 370             self._fill_queue(loglstar)
    371
    372         # Grab the earliest entry.

/usr/local/lib/python3.7/site-packages/dynesty/sampler.py in _fill_queue(self, loglstar)
    357         if self.use_pool_evolve:
    358             # Use the pool to propose ("evolve") a new live point.
--> 359             self.queue = list(self.M(evolve_point, args))
    360         else:
    361             # Propose ("evolve") a new live point using the default map

/usr/local/lib/python3.7/multiprocessing/pool.py in map(self, func, iterable, chunksize)
    288         in a list that is returned.
    289         '''
--> 290         return self._map_async(func, iterable, mapstar, chunksize).get()
    291
    292     def starmap(self, func, iterable, chunksize=None):

/usr/local/lib/python3.7/multiprocessing/pool.py in get(self, timeout)
    681             return self._value
    682         else:
--> 683             raise self._value
    684
    685     def _set(self, i, obj):
RuntimeError: Random walk sampling appears to be stuck! Some useful output quantities: (the same u, drhat, dr, du, u_prop, loglstar, logl_prop, axes and scale diagnostics as in the RemoteTraceback above; scale: 1.8734013431785728e-06)
I also noticed that several people (including you) have asked this question on the dynesty issue tracker (e.g., #140, #304). Has it been solved yet?
Thanks!
Hi!
I'm extremely sorry for the late reply. For some reason I don't get emails when an issue is opened; I only got a notification for this latest response.
What I can advise is to disable the dynamic sampler with dynamic=False; it can be very finicky. (See the sketch at the end of this reply.)
About the random walker getting stuck: it can sometimes happen because of the priors, if you're not using the 'default' ones. In other cases it can occur when one of the photometry points ARIADNE retrieved isn't actually from the star, or is of extremely bad quality and somehow slipped past the quality checks. I can help troubleshoot this a bit if you tell me which star you're using.
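For reference, here is a minimal sketch of a tutorial-style setup with the dynamic sampler disabled and the default priors restored. The attribute names and the order of the setup list follow the astroARIADNE README as far as I recall, and the target star and output folder are just placeholders, so double-check everything against your installed version:

from astroARIADNE.star import Star
from astroARIADNE.fitter import Fitter

# Placeholder target; swap in your own name, coordinates and Gaia source id.
s = Star('NGTS-6', 75.795, -30.399, g_id=4875693023844840448)

f = Fitter()
f.star = s
f.out_folder = 'ngts6_out'

# setup = [engine, live points, dlogz, bounding method, sampling method, threads, dynamic]
# dynamic=False turns off dynesty's dynamic batches, which is what triggers the
# "Random walk sampling appears to be stuck" crash shown above.
f.setup = ['dynesty', 500, 0.5, 'multi', 'rwalk', 4, False]

f.bma = True
f.models = ['phoenix', 'btsettl', 'kurucz', 'ck04']
f.n_samples = 100000

# Default priors, as in the tutorial.
f.prior_setup = {
    'teff': ('default'),
    'logg': ('default'),
    'z': ('default'),
    'dist': ('default'),
    'rad': ('default'),
    'Av': ('default'),
}

f.initialize()
f.fit_bma()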
Hi @jvines, thanks for your reply. I disabled the dynamic sampler, and it works! Thank you very much!
In addition, some people change dlogz from 0.5 to 0.1 and the "sampling stuck" problem also disappears, but the fit then takes much longer than with your approach.
Reducing dlogz makes dynesty's stopping criterion stricter, so naturally the algorithm takes longer to reach it. That in turn might make the dynamic sampler behave correctly, but that's anyone's guess, sadly. I think I'll remove the dynamic sampler from the tutorial to keep this from happening to more people. (A short standalone illustration of what dlogz controls is below.)
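For anyone curious what that means at the dynesty level, here is a tiny standalone sketch with a toy Gaussian likelihood (not ARIADNE's actual model, and not necessarily how ARIADNE drives dynesty internally); it just shows the static sampler and how dlogz enters as the stopping threshold:

import numpy as np
import dynesty

ndim = 3

def loglike(x):
    # Toy likelihood: standard normal in each dimension.
    return -0.5 * np.sum(x**2)

def prior_transform(u):
    # Map the unit cube onto a flat prior over [-10, 10] in each dimension.
    return 20.0 * u - 10.0

# Static nested sampler; presumably roughly what ARIADNE falls back to with dynamic=False.
sampler = dynesty.NestedSampler(loglike, prior_transform, ndim, nlive=500)

# dlogz is the tolerance on the estimated remaining log-evidence:
# smaller values mean a stricter stopping criterion and a longer run.
sampler.run_nested(dlogz=0.1)
results = sampler.results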
I'll close this now. Feel free to reopen if anything comes up.
Both @rachelbf and I are trying to run the example code and are getting stuck at the same spot.
For both of us, the code runs for several minutes and then just stops, doing nothing further (we left it running overnight and it still hadn't changed).
We've tested it on both Mac and Linux running Python 3.7, with different numbers of threads. Both the default and tuned priors fail. Any ideas on why it's getting stuck?