LemurPwned / VISM

Software for visualising magnetic layers

AppCrash #125

Open JakubChecinski opened 6 years ago

JakubChecinski commented 6 years ago

Not 100% sure what caused it, maybe choosing not to normalize 3D vectors in a widget

The error message is attached as an image here: https://imgur.com/a/kHeqYxi

Edit: ok, I think it was actually the 3D_CUBIC window, which could explain the lack-of-memory error

LemurPwned commented 6 years ago

The crash is most likely due to the Text Selection bug - that also resolves #124

JakubChecinski commented 6 years ago

I still get this error for 3D Cubic, even with normalization

The error message is almost identical but now also contains info about a runtime warning: https://imgur.com/j7olFzu

LemurPwned commented 6 years ago

Can you provide more info about the dataset and settings? Does it happen each time?

Runtime warnings come from dividing by zero or by a number very close to zero - these produce np.nan values, which later allow an empty object to be skipped instead of drawn
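
For illustration, here is a minimal numpy sketch (not the actual VISM code) of that behaviour:

```python
import numpy as np

# Minimal sketch (not the actual VISM code): normalizing a batch of 3D vectors.
# A zero-length vector gives 0/0, which emits a RuntimeWarning and fills the
# row with np.nan; those rows can then be filtered out instead of drawn.
vectors = np.array([[1.0, 0.0, 0.0],
                    [0.0, 0.0, 0.0]])   # zero vector -> division by zero

norms = np.linalg.norm(vectors, axis=1, keepdims=True)
normalized = vectors / norms            # RuntimeWarning: invalid value encountered

# Keep only finite rows, so the renderer simply skips the empty object.
drawable = normalized[np.isfinite(normalized).all(axis=1)]
print(drawable)                         # [[1. 0. 0.]]
```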

JakubChecinski commented 6 years ago

I checked different datasets and found an interesting thing: this error occurs only when loading the full Examples->0520nm directory. If I load a single file or the Examples->0200nm directory, the 3D Cubic widget works fine. Maybe my laptop really lacks available memory?

Settings: default 3D Cubic settings; I do not touch anything.

Yes, for the 0520nm directory it seems to happen every time.

LemurPwned commented 6 years ago

Does it happen for Arrow structures? This might be due to large memory usage -> 520 is a huge dataset, and each cube requires 72 vertices. Similarly, arrows also need plenty of vertices - 64 for each structure.
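
As a rough back-of-envelope sketch (only the per-object vertex counts come from this thread; the grid size and per-vertex layout below are assumptions), the memory needed for a single rendered frame can be estimated like this:

```python
# Back-of-envelope estimate of per-frame vertex-buffer size.
VERTS_PER_CUBE = 72     # per the discussion above
VERTS_PER_ARROW = 64    # per the discussion above
FLOATS_PER_VERT = 6     # assumption: 3 position + 3 colour components
BYTES_PER_FLOAT = 4     # float32

def frame_size_mb(n_cells, verts_per_object):
    """Rough size of one frame's vertex buffer, in megabytes."""
    return n_cells * verts_per_object * FLOATS_PER_VERT * BYTES_PER_FLOAT / 1e6

# Hypothetical grid size -- the real 0520nm dimensions may differ.
n_cells = 260 * 260 * 4
print(f"cubes:  {frame_size_mb(n_cells, VERTS_PER_CUBE):.0f} MB per frame")
print(f"arrows: {frame_size_mb(n_cells, VERTS_PER_ARROW):.0f} MB per frame")
```

Multiply that by the number of files loaded at once and it is easy to see how RAM runs out.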

JakubChecinski commented 6 years ago

I tried to check for Arrow, but got bug #133 instead.

I also think it's memory, but can we provide a wrapper around it? I.e., instead of crashing, the app would refuse to load a dataset that is too large and display a warning? Or is that technically too difficult?
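
Something along these lines, purely as a sketch (the names and the psutil-based check are illustrative, not an existing VISM API):

```python
import os
import psutil  # third-party; used here only to query available RAM

class DatasetTooLargeError(RuntimeError):
    """Raised instead of letting the app crash on an oversized dataset."""

def check_dataset_fits(directory, safety_factor=3.0):
    """Illustrative guard: refuse to load a directory whose raw size,
    scaled by a safety factor for in-memory expansion, exceeds free RAM."""
    raw_bytes = sum(
        os.path.getsize(os.path.join(root, name))
        for root, _, files in os.walk(directory)
        for name in files
    )
    available = psutil.virtual_memory().available
    if raw_bytes * safety_factor > available:
        raise DatasetTooLargeError(
            f"{directory}: needs ~{raw_bytes * safety_factor / 1e9:.1f} GB, "
            f"only {available / 1e9:.1f} GB available"
        )
```

The GUI could catch DatasetTooLargeError and show a warning dialog rather than crashing.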

LemurPwned commented 6 years ago

(Same goes for #133.) A wrapper is a straightforward solution. I might think about loading the data progressively into RAM as the animation progresses; however, there are two options:

Anyway, how about when larger averaging or decimation is selected? The general problem here is that there is a time cap set for data loading, hence the TimeoutError. For example, if there are 100 files, they are all loaded in parallel and each has a time cap of 20 seconds by default to be processed. I can obviously increase the limit, but it seems to me that it won't fix the issue, since 20 seconds is plenty for a file to be loaded and processed.
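
As a sketch of that scheme (not the actual loader code; the function names here are illustrative), the per-file cap looks roughly like this:

```python
from concurrent.futures import ProcessPoolExecutor, TimeoutError as FutureTimeout

TIME_CAP_SECONDS = 20  # default cap mentioned above

def process_file(path):
    # Placeholder for the real parsing/averaging work on a single data file.
    with open(path, "rb") as fh:
        return len(fh.read())

def load_all(paths):
    """Load files in parallel; each result must arrive within the time cap,
    otherwise that file is reported as timed out (the failure seen on Windows)."""
    results = {}
    with ProcessPoolExecutor() as pool:
        futures = {pool.submit(process_file, p): p for p in paths}
        for future, path in futures.items():
            try:
                results[path] = future.result(timeout=TIME_CAP_SECONDS)
            except FutureTimeout:
                print(f"{path} exceeded the {TIME_CAP_SECONDS}s cap")
    return results
```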

Weirdly enough, it seems like a purely Windows issue. I have noticed that processing somehow takes much longer there.