Open rafaqz opened 6 years ago
That definitely looks like a bug... but hard to pinpoint without sample code.
Can you build a simple testcase that causes this issue?
You could generate sample data using something like:
x=collect(1:2000)
y=rand(length(x))
Are you using multiple y for the same x? Are you using 2-dimensional x vectors? 2-dimensional y vectors?
...?
Just a single 1 dimensional vector.
I can't duplicate the issue with random data, but it's reliable with my model output, which has a lot of very small numbers around zero in it. Other than that, I have no idea what's going on.
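Data with that character can be synthesized for a testcase, e.g. (a sketch; the array size and magnitudes are guesses, not the actual model output):

```julia
# Synthetic stand-in for the model output: many very small
# values clustered around zero (size and scale are assumptions).
x = collect(1:200_000)
y = 1e-12 .* randn(length(x))
```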
Thanks for that tidbit. It might be a bug with the algorithm for "F1 acceleration".
You can disable F1 acceleration by adding the following to your ~/.juliarc file:
DEFAULTS_INSPECTDR = Dict(
:droppoints => :never,
)
If you no longer get a bounds error, that is likely the cause.
...Sadly, disabling F1 acceleration reduces the plotting speed somewhat - especially with large datasets.
Yep that fixed it, and slowed it down a bit...
Great. That's good to know.
The problem is in the _reduce() function in src/datasets.jl.
The F1 acceleration algorithm is known not to be rock solid (see comment):
#FIXME: improve algorithm
#Algorithm succeptible to cumulative errors (esp w large x-values & large xres_max):
I am not exactly certain how to make this more robust yet.
One hack you could try is to increase the size of the padding at the "end" of the undersampled vector:
sz = min(n_ds, xres_max)+1[INCREASE THIS NUMBER]+2*min_lookahead #Allow extra points in case
Try increasing the value of 1 in this line to something larger. I still don't consider this a real solution to the problem, just a band-aid.
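The cumulative-error failure mode the FIXME warns about can be seen with plain floating-point accumulation (an illustrative sketch, not InspectDR's actual index arithmetic):

```julia
# A running accumulator drifts away from the exact value:
acc = foldl(+, fill(0.1, 10))   # ten sequential additions of 0.1
acc == 1.0                       # false: acc == 0.9999999999999999
# With large x-values and a large xres_max, an index derived from a
# drifting accumulator like this can land one slot past the
# preallocated output vector, hence the extra padding in `sz`.
```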
@rafaqz: BTW, I know disabling F1 acceleration slows things down significantly for large datasets (easily 2-10x for GBs of data)... but is it still faster than GR/plotly? Just curious.
Update:
Just added a failsafe that falls back to plotting with the naive solution if the code for F1 acceleration crashes (no need for premature de-optimization with :droppoints => :never).
Go ahead and try a Pkg.update() to see if the failsafe is ok: you should now get a warning message, but still get a (slowly generated) plot when the algorithm fails.
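The failsafe pattern might look roughly like this (a sketch, not the actual InspectDR code; the function and argument names are made up):

```julia
# Hypothetical wrapper: try the fast F1-accelerated reduction and
# fall back to the naive (plot-every-point) path if it throws.
function reduce_or_fallback(reduce_fn, x, y)
    try
        return reduce_fn(x, y)
    catch err
        @warn "F1 acceleration failed; plotting full dataset" exception=err
        return (x, y)   # naive solution: keep every point
    end
end
```

A BoundsError inside the accelerated path then degrades to a slower plot plus a warning instead of a crash.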
I'm getting this error with some series lengths of large arrays, using InspectDR as a Plots.jl backend. It doesn't occur with other backends, and seems to be occurring in InspectDR internals; my arrays are much larger than 1007 elements.