scottcha / OpenAvalancheProject

Open source project to bring data and ml to avalanche forecasting
MIT License

Notebooks are not working, repo is too untidy #41

Closed Hsgngr closed 3 years ago

Hsgngr commented 4 years ago

Hi scottcha, we would like to contribute to the project, but we cannot run the notebooks. Could you add some documentation, please?

The barrier to contributing is really high, especially for anyone just getting started with the project, since there isn't any documentation.

For example, where does all of this data come from:

processed_path = '/media/scottcha/E1/Data/OAPMLData/GFSTrainingData/GFSFiltered/'
path_to_labels = '/media/scottcha/E1/Data/OAPMLData/CleanedForecastsNWAC_CAIC_UAC.V1.2013-2020.csv'
temp_path = '/media/scottcha/Data/Temp/'
features_path = '/media/scottcha/Data/Temp/Features/'
y_path = '/media/scottcha/E1/Data/OAPMLData/ExtractedTSFresh/Results/'
final_path = '/media/scottcha/E1/Temp/Final/'
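
For anyone else hitting this: these are hard-coded notebook variables pointing at scottcha's local drives, so they have to be redefined for your own machine. Below is a minimal sketch of pointing them at a single local data root, assuming you mirror the directory layout from the original paths; the root location itself is just a placeholder.

```python
from pathlib import Path

# Assumed local data root -- replace with wherever you have staged the data.
DATA_ROOT = Path.home() / "OAPMLData"

processed_path = DATA_ROOT / "GFSTrainingData" / "GFSFiltered"
path_to_labels = DATA_ROOT / "CleanedForecastsNWAC_CAIC_UAC.V1.2013-2020.csv"
temp_path = DATA_ROOT / "Temp"
features_path = temp_path / "Features"
y_path = DATA_ROOT / "ExtractedTSFresh" / "Results"
final_path = DATA_ROOT / "Final"

# Create the output directories up front so the notebooks don't fail on first write.
for p in (temp_path, features_path, y_path, final_path):
    p.mkdir(parents=True, exist_ok=True)
```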

scottcha commented 4 years ago

Ha, yeah, you're right! You caught me in the middle of a major refactor. I chose to overwrite master prematurely since it was very out of date, I was concerned folks were using it, and honestly I didn't feel it was a good basis for future work. The current cleanup is happening in https://github.com/scottcha/OpenAvalancheProject/tree/clean-notebooks, and I'm trying to get it to a point where it can be used as is. I'm hoping to get there in the next few days, if you can hold off until then. At that point it should be much easier to contribute.

Hsgngr commented 4 years ago

Thank you so much; if there's anything we can help with, let me know.

scottcha commented 4 years ago

I don't consider this closed, but I did just pull in a large update to the notebooks, which also includes a tutorial in the main Readme. I'm going to take another pass through, as well as complete the ML notebook updates, before resolving this.

scottcha commented 3 years ago

Closing this with the latest PR as there have been several updates to address this issue:

  1. The notebooks have been cleaned and operationalized
  2. The data pipeline notebooks have moved to nbdev and now export importable modules; they are no longer intended to be run directly (see the sketch after this list).
  3. A tutorial has been added to the main readme; it is kept minimal by leveraging the data pipeline modules.
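
For a sense of what item 2 means in practice, here is a rough sketch of how code exported from nbdev notebooks is typically consumed from Python. The package, module, and function names below are placeholders I'm assuming for illustration, not the project's confirmed API; the readme tutorial documents the actual imports.

```python
# With nbdev, cells tagged for export in the pipeline notebooks are written out
# to an importable Python package (e.g. via the nbdev_build_lib command in
# nbdev 1.x), so downstream code imports the modules instead of running the
# notebooks themselves.
#
# NOTE: the package/module/function names here are hypothetical placeholders --
# see the readme tutorial for the real entry points.
from openavalancheproject.data_pipeline import build_training_set  # hypothetical

features, labels = build_training_set(
    gfs_dir="/path/to/GFSFiltered/",                                       # filtered GFS inputs
    labels_csv="/path/to/CleanedForecastsNWAC_CAIC_UAC.V1.2013-2020.csv",  # forecast labels
)
```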