I have tidied up the code so that we only make one API call, which prevents us from exceeding the daily/hourly rate limit every time we run the notebook. I have then processed all the data into a clean dataframe covering every day from 1940 to the present, for every country.
My next step is to export it as a database, for more efficient storage and querying when building the visualisations.
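A minimal sketch of the single-call idea, assuming a JSON API: the response is cached to disk on the first run, so re-running the notebook reads the cache instead of making another API call. The endpoint URL and cache filename here are placeholders, not the actual ones used in the notebook.

```python
import json
import urllib.request
from pathlib import Path

CACHE = Path("api_response.json")           # hypothetical cache file
URL = "https://example.com/api/data"        # placeholder endpoint

def fetch_once(url: str = URL, cache: Path = CACHE) -> dict:
    """Return the API response, hitting the network only if no cache exists."""
    if cache.exists():
        # Re-runs of the notebook take this branch: no API call is made,
        # so the daily/hourly rate limit is never consumed.
        return json.loads(cache.read_text())
    with urllib.request.urlopen(url, timeout=30) as resp:
        payload = json.loads(resp.read())
    cache.write_text(json.dumps(payload))   # persist for subsequent runs
    return payload
```

Deleting the cache file forces a fresh download, which keeps the one-call-per-dataset behaviour explicit rather than hidden.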
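The export step could look like the following sketch, assuming the cleaned data is a pandas dataframe: it writes to a local SQLite file, which later visualisation code can query for subsets (one country, one date range) instead of reloading everything. The column names, table name, and filename are illustrative assumptions.

```python
import sqlite3
import pandas as pd

# Hypothetical sample of the cleaned daily-by-country dataframe
df = pd.DataFrame({
    "date": ["1940-01-01", "1940-01-01"],
    "country": ["GB", "FR"],
    "value": [1.2, 3.4],
})

# Write the dataframe to a SQLite database file (table name is illustrative)
with sqlite3.connect("climate.db") as conn:
    df.to_sql("daily", conn, if_exists="replace", index=False)

    # Visualisation code can then pull only the rows it needs,
    # rather than holding the full 1940-to-present dataframe in memory
    subset = pd.read_sql("SELECT * FROM daily WHERE country = 'GB'", conn)
```

SQLite keeps everything in a single file with no server to run, which suits a notebook workflow; `if_exists="replace"` makes the export idempotent across re-runs.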