cparcerisas opened this issue 2 months ago
@cparcerisas, my vote would be option 1), with the stated caveat that it is due to limitations in the Binder environment.
Strategy 3) is ultimately the more robust one (no particular environment needed). The kernel may be dying because of memory limitations, but I need to understand the processing code in more detail to say for sure. Closing any open datasets will force the memory to be freed.
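The "close datasets as you go" idea can be sketched as a processing loop that releases each dataset before loading the next, keeping peak memory bounded by one station at a time. This is a minimal sketch: `load_fn` and `process_fn` are hypothetical placeholders standing in for the notebook's loading/analysis functions, not code from the repository.

```python
import gc

def process_stations(station_files, load_fn, process_fn):
    """Process one station at a time, closing each dataset before
    loading the next so memory from the previous one can be reclaimed.

    load_fn and process_fn are placeholders: load_fn(path) must return
    an object with a .close() method (e.g. an xarray Dataset), and
    process_fn(ds) returns whatever summary we want to keep.
    """
    results = []
    for path in station_files:
        ds = load_fn(path)
        try:
            # Keep only the (small) processed result, not the raw data.
            results.append(process_fn(ds))
        finally:
            ds.close()    # release file handles / cached arrays
            gc.collect()  # encourage Python to reclaim the memory now
    return results
```

The key design point is that only the reduced result of each station survives the loop iteration; the raw dataset is closed in a `finally` block so it is released even if processing fails partway.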
@danellecline yes, I played around with it; you can see it here: https://github.com/ioos/soundcoop/blob/pypam/binder_docs/2_analysis_of_HMD_pypam/data_analysis_with_pypam.ipynb (functions `load_data_station_slow` and `load_data_station`).
@carriecwall @carueda @danellecline @KarinaKh
So I have been playing around with options for how to deal with the large amount of data we need for the pypam and env notebooks. I have several proposals, but I got a bit stuck on some of them. I list them here:
Let me know the preferred solution, or whether anyone has possible improvements/suggestions for any of the options.