hpparvi / ldtk

Python toolkit for calculating stellar limb darkening profiles and model-specific coefficients using the stellar atmosphere spectrum library by Husser et al. (2013). Described in Parviainen & Aigrain, MNRAS 453, 3821–3826 (2015).
GNU General Public License v2.0

create_profiles() throws "TypeError" #26

Closed: iancrossfield closed this issue 3 years ago

iancrossfield commented 3 years ago

When I follow the examples at https://github.com/hpparvi/ldtk/blob/master/notebooks/01_Example_basics.ipynb (or the other sample notebooks), the code crashes at

```python
ps = sc.create_profiles(nsamples=2000)
```

which results in TypeError: expected dtype object, got 'numpy.dtype[float64]'

This is PyLDTK v1.7.0, NumPy v1.21.2, and Python v3.7.6, running on Ubuntu 20.04 LTS.

The full error traceback is:

```
In [4]: ps = sc.create_profiles(nsamples=2000)

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
----> 1 ps = sc.create_profiles(nsamples=2000)

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in create_profiles(self, nsamples, teff, logg, metal)
    484             self.ldp_samples[iflt, :, :] = self.itps[iflt](samples)
    485
--> 486         return LDPSet(self.filter_names, self.mu, self.ldp_samples)
    487
    488     @property

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in __init__(self, filters, mu, ldp_samples)
    114         self._em = 1.0
    115
--> 116         self.fit_limb()
    117         self.resample_linear_mu()
    118

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in fit_limb(self)
    181         mu_new = linspace(self._mu_orig[0], 1, 1500)
    182         flux_new = interp1d(self._mu_orig, self._mean_orig.mean(0), 'quadratic')(mu_new)
--> 183         res = minimize(minfun, array([0.05, 0.15, 0.5, 1.5]), (mu_new, flux_new), method='Nelder-Mead')
    184         self._limb_minimization = res
    185         self.set_limb_mu(res.x[1])

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/_minimize.py in minimize(fun, x0, args, method, jac, hess, hessp, bounds, constraints, tol, callback, options)
    610     if meth == 'nelder-mead':
    611         return _minimize_neldermead(fun, x0, args, callback, bounds=bounds,
--> 612                                     **options)
    613     elif meth == 'powell':
    614         return _minimize_powell(fun, x0, args, callback, bounds, **options)

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py in _minimize_neldermead(func, x0, args, callback, maxiter, maxfev, disp, return_all, initial_simplex, xatol, fatol, adaptive, bounds, **unknown_options)
    748
    749     for k in range(N + 1):
--> 750         fsim[k] = func(sim[k])
    751
    752     ind = np.argsort(fsim)

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py in function_wrapper(x, *wrapper_args)
    462     def function_wrapper(x, *wrapper_args):
    463         ncalls[0] += 1
--> 464         return function(np.copy(x), *(wrapper_args + args))
    465
    466     return ncalls, function_wrapper

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in minfun(x, mu, flux)
    178     def fit_limb(self):
    179         def minfun(x, mu, flux):
--> 180             return ((flux - ldm_with_edge(mu, x[0], x[1], x[2:])) ** 2).sum()
    181         mu_new = linspace(self._mu_orig[0], 1, 1500)
    182         flux_new = interp1d(self._mu_orig, self._mean_orig.mean(0), 'quadratic')(mu_new)

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in ldm_with_edge(mu, e0, e1, ldc, ldm)
     78         return full_like(mu, inf)
     79     nmu = clip((mu-e1)/(1-e1), 0.0, 1.0)
---> 80     return smootherstep(mu, e0, e1) * ldm(nmu, ldc)
     81
     82

TypeError: expected dtype object, got 'numpy.dtype[float64]'
```
hpparvi commented 3 years ago

Hi Ian,

Could you still give me your numba version? I haven't been able to reproduce this bug.

Cheers, Hannu

iancrossfield commented 3 years ago

Hannu: I'm running numba v0.48.0. Thanks for looking into it! -Ian

hpparvi commented 3 years ago

Sorry for taking so long... I've managed to reproduce the bug, and it seems to come from the old numba version. Can you upgrade to a newer numba (I'm using 0.53.0) and let me know whether that fixes the issue?
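For anyone landing on this issue later, a quick dependency-free way to check whether an installed numba predates the version this fix was verified against in this thread (0.53.0 for the maintainer, 0.54 for the reporter) might look like the sketch below. `MIN_NUMBA`, `version_tuple`, and `numba_needs_upgrade` are illustrative names, not part of PyLDTK:

```python
import importlib.metadata

# Minimum numba version that resolved this TypeError according to this
# thread; treat it as a heuristic, not an official compatibility matrix.
MIN_NUMBA = (0, 53, 0)

def version_tuple(version: str) -> tuple:
    """Parse a dotted version string like '0.48.0' into a comparable tuple."""
    parts = []
    for piece in version.split(".")[:3]:
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    while len(parts) < 3:  # pad so '0.53' compares equal to '0.53.0'
        parts.append(0)
    return tuple(parts)

def numba_needs_upgrade() -> bool:
    """True if numba is missing or older than MIN_NUMBA."""
    try:
        installed = importlib.metadata.version("numba")
    except importlib.metadata.PackageNotFoundError:
        return True  # numba not installed at all
    return version_tuple(installed) < MIN_NUMBA
```

If it reports an upgrade is needed, `pip install -U numba` (or the conda equivalent) should sort it out. Note that `importlib.metadata` requires Python 3.8+; on 3.7, the `importlib_metadata` backport offers the same API.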

iancrossfield commented 3 years ago

@hpparvi - upgrading to numba 0.54 did the trick. Sorry the trouble was on my end all along... thanks for taking the time to look into this. Take care!

hpparvi commented 3 years ago

Great to hear this is sorted! :)