agmunozs opened this issue 9 months ago
The short answer is that you need to create this file to store your credentials. @eorland, can you document this in pfdf/README.md?
@thomasastanley Yes, I can. @agmunozs We once had a tutorial for the token setup in the old README, which still needs to be updated (my fault). Downloading any data from FIRMS programmatically requires a token associated with your EarthData account. Thankfully, this is a very easy step. All the instructions for creating a token are at this link: https://nrt4.modaps.eosdis.nasa.gov/help/downloads. Skip down to the section titled "Download Tokens". Once generated, save the token to a plain-text file called token.txt in the ref_data directory.
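If it helps, the file can also be created programmatically. Here is a minimal sketch, assuming the ref_data directory sits under pfdf/ (save_firms_token is a hypothetical helper for illustration, not part of the repo):

```python
from pathlib import Path

def save_firms_token(token: str, path: Path = Path("pfdf/ref_data/token.txt")) -> Path:
    """Write a FIRMS/EarthData download token where the pfdf scripts expect it."""
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(token.strip())  # strip stray whitespace and newlines
    return path
```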
Edit: Please note that you will also need to supply authentication credentials to access HLS imagery for burn severity mapping. The instructions for that are in make_netrc.py.
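For reference, a sketch of the kind of ~/.netrc entry that make_netrc.py presumably sets up; urs.earthdata.nasa.gov is the standard NASA Earthdata login host, but check make_netrc.py itself for the exact details (write_netrc_entry is a hypothetical helper, not the script's actual code):

```python
from pathlib import Path

def write_netrc_entry(netrc_path: Path, username: str, password: str) -> None:
    """Append a NASA Earthdata login entry to a .netrc file (idempotent)."""
    entry = f"machine urs.earthdata.nasa.gov login {username} password {password}\n"
    existing = netrc_path.read_text() if netrc_path.exists() else ""
    if "urs.earthdata.nasa.gov" not in existing:
        with netrc_path.open("a") as f:
            f.write(entry)
    netrc_path.chmod(0o600)  # .netrc must not be readable by other users
```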
Let us know if you still have issues.
Hi, both,
Thanks a lot. I wasn’t able to come back to this issue until today.
Now it gets further, but I hit another error:
```
python pfdf/scripts/model_run.py --path pfdf
Opening basin data
Opened basin data
Downloading FIRMS
FIRMS downloaded
No run data file found! The data from this run will be saved to a new file.
```
...more output… and then the part that might be relevant to solving the problem:
```
    snpp_df = pd.concat([read_function(file,bbox) for file in snpp_paths[-date_range:]],ignore_index=True)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/lhasa/lib/python3.11/site-packages/pandas/core/reshape/concat.py", line 380, in concat
    op = _Concatenator(
         ^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/lhasa/lib/python3.11/site-packages/pandas/core/reshape/concat.py", line 443, in __init__
    objs, keys = self._clean_keys_and_objs(objs, keys)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/lhasa/lib/python3.11/site-packages/pandas/core/reshape/concat.py", line 505, in _clean_keys_and_objs
    raise ValueError("No objects to concatenate")
```
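For context, pandas raises this ValueError whenever pd.concat receives an empty sequence, so it looks like no SNPP files matched (e.g., snpp_paths is empty, or the date_range slice selects nothing). A minimal reproduction, with hypothetical names:

```python
import pandas as pd

# If the glob for downloaded SNPP FIRMS files finds nothing, the list
# comprehension passed to pd.concat is empty and pandas raises this error.
snpp_paths = []  # hypothetical: no files were found on disk
try:
    snpp_df = pd.concat([pd.read_csv(p) for p in snpp_paths], ignore_index=True)
except ValueError as err:
    print(err)  # -> No objects to concatenate
```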
Any suggestions?
Thanks,
Ángel G. Muñoz, PhD (he/him)
> No run data file found! The data from this run will be saved to a new file.
This part just means it got to that point in the script for the first time. It's actually good news.
Thanks, Thomas.
Right, yes. I saw the code and thought the same, but I have no idea about the next part…
Also, this might not be the best place to discuss it, but I wanted to know whether there is already a way for the landslide model (not the pfdf one) to read local files rather than downloading recent forecasts via OPeNDAP. This would be useful for predictability analyses beyond 2 days during the hindcast period and, of particular interest to me, for exploring forcing the model with subseasonal predictions (if you're interested in a collaboration, let me know). Yes, I know I'd need the p99 values from the subseasonal models for the calibration/normalisation in lhasav2, but I'm not worried about that.
Cheers,
Ángel G. Muñoz, PhD (he/him)
Hi,
I'm trying to run the pfdf model, but it complains that the token.txt file is missing.
Any advice?
Output:
```
python pfdf/scripts/model_run.py --path pfdf
Opening basin data
Opened basin data
Downloading FIRMS
Traceback (most recent call last):
  File "/Volumes/Models/LNDSLIDES/Sibila_lndslides_v1/pfdf/scripts/model_run.py", line 358, in <module>
    workflow(bbox)
  File "/Volumes/Models/LNDSLIDES/Sibila_lndslides_v1/pfdf/scripts/model_run.py", line 88, in workflow
    firms_request.download_firms(
  File "/Volumes/Models/LNDSLIDES/Sibila_lndslides_v1/pfdf/scripts/firms_request.py", line 24, in download_firms
    with open(token_path) as f:
         ^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: 'pfdf/ref_data/token.txt'
```
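Until the token setup is documented, a defensive check along these lines would make the failure clearer; load_token is a hypothetical helper (the default path comes from the traceback above), not part of pfdf:

```python
from pathlib import Path

def load_token(token_path: Path = Path("pfdf/ref_data/token.txt")) -> str:
    """Read the FIRMS download token, failing with an actionable message."""
    if not token_path.is_file():
        raise FileNotFoundError(
            f"Missing {token_path}. Generate a download token at "
            "https://nrt4.modaps.eosdis.nasa.gov/help/downloads "
            "and save it to that path."
        )
    return token_path.read_text().strip()
```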