Closed. martindevora closed this issue 3 years ago.
OK, let me try to explain. TLS requires a minimum transit depth so that it does not search endlessly into the noise floor. The default setting is a depth of 10 ppm. If no test model of 10 ppm depth "wins" over the flat line (within one trial transit period), the fallback kicks in. The fallback takes the number of data points as the reduced chi-squared (for dy=1); compare Figure 3, panel (a), in the paper. Everything else follows from that chi2red statistic: every trial period must yield some chi2red in order to build a periodogram.
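To make that concrete, here is a minimal toy sketch of the fallback logic as I understand it (my own code, not TLS's actual implementation; the function name and the depth proxy are assumptions): if no signal of at least `transit_depth_min` depth is found, the statistic degenerates to the number of data points.

```python
import numpy as np

def residuals_with_fallback(flux, transit_depth_min=10e-6):
    # Crude depth proxy: how far the mean flux dips below the baseline of 1
    depth_estimate = 1.0 - np.mean(flux)
    if depth_estimate < transit_depth_min:
        # Fallback: score it as a flat-line fit (signal=1) with dy=1, whose
        # squared sum of residuals is taken to be the number of data points
        return len(flux)
    # Otherwise a real (here: toy) model fit would be scored instead
    return float(np.sum((flux - (1.0 - depth_estimate)) ** 2))

print(residuals_with_fallback(np.ones(100)))  # 100: the fallback fired
```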
OK, I understand the main idea. Then, if I'm not wrong, that is why you use the running mean: you want an estimate of the transit depth to decide whether it is worth running the least-squares algorithm?
Thanks for your quick answer :)
What running mean?
The one that is passed as the first parameter to the lowest_residuals_in_this_duration method:

```python
    mean=1 - running_mean(patched_data, duration),
    transit_depth_min=transit_depth_min,
    patched_data_arr=patched_data,
    duration=duration,
    signal=lc_arr[chosen_transit_row],
    inverse_squared_patched_dy_arr=inverse_squared_patched_dy,
    overshoot=lc_cache_overview["overshoot"][chosen_transit_row],
    ootr=out_of_transit_residuals(
        patched_data, duration, inverse_squared_patched_dy
    ),
    summed_edge_effect_correction=this_edge_effect_correction,
    chosen_transit_row=chosen_transit_row,
    datapoints=len(flux),
    T0_fit_margin=T0_fit_margin,
)
```
and then used inside the method:

```python
for i in range(len(mean)):
    if (mean[i] > transit_depth_min) and (i % xth_point == 0):
```
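For reference, here is a minimal toy sketch of what that prefilter does (my own code, not the TLS internals; `candidate_positions` and the `xth_point=1` default are made up): positions whose local running mean dips deeper than `transit_depth_min` are the only ones handed to the expensive least-squares fit.

```python
import numpy as np

def running_mean(data, width):
    # Sliding mean over `width` consecutive points (same idea as TLS's helper)
    cumsum = np.cumsum(np.insert(data, 0, 0.0))
    return (cumsum[width:] - cumsum[:-width]) / width

def candidate_positions(patched_data, duration, transit_depth_min, xth_point=1):
    # Only indices whose local depth estimate exceeds the threshold
    # (optionally decimated by xth_point) survive the prefilter
    mean = 1.0 - running_mean(patched_data, duration)
    return [i for i in range(len(mean))
            if mean[i] > transit_depth_min and i % xth_point == 0]

flux = np.ones(20)
flux[8:12] = 0.999  # a 1000 ppm dip
print(candidate_positions(flux, duration=4, transit_depth_min=10e-6))
# [5, 6, 7, 8, 9, 10, 11] — only windows overlapping the dip are kept
```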
**Describe the bug**
core.lowest_residuals_in_this_duration returns the same value for all durations in core.search_period calls. I don't know why this happens. It seems to be intended, as the method's documentation says: "if nothing is fit, we fit a straight line: signal=1. Then, at dy=1, the squared sum of residuals equals the number of datapoints".
**To Reproduce**
I don't know what actually triggers this, but it happens sometimes.
**Additional context**
This is known behaviour, since the power execution path of TLS accounts for it:
```python
if max(test_statistic_residuals) == min(test_statistic_residuals):
    no_transits_were_fit = True
    warnings.warn('No transit were fit. Try smaller "transit_depth_min"')
```
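To illustrate why the periodogram then comes out flat, here is a toy sketch (my own code with made-up names, not the actual TLS internals): when the fallback fires for every duration, each call returns len(flux), so the residuals array is constant.

```python
import numpy as np

def toy_residuals(flux, duration, transit_depth_min=10e-6):
    # Hypothetical stand-in for lowest_residuals_in_this_duration: if no
    # window of `duration` points dips deeper than transit_depth_min,
    # return the fallback value len(flux)
    window_means = np.convolve(flux, np.ones(duration) / duration, mode="valid")
    if np.max(1.0 - window_means) <= transit_depth_min:
        return len(flux)  # fallback: identical for every duration
    return float(np.sum((flux - 1.0) ** 2))

flat_flux = np.ones(500)  # no transit at all
print([toy_residuals(flat_flux, d) for d in (5, 10, 20)])  # [500, 500, 500]
```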
Rather than reporting a bug (this isn't one), I'd like to understand the reasoning behind this implementation, because I haven't managed to understand it even after looking at the code for a while. I assumed it was caused by no transits of the given durations being found. However, I don't think it should work that way, in the sense that the least-squares fits should always be performed.
Kind regards and thank you for your attention.