arthur-e opened this issue 3 years ago (status: Open)
Regarding the input files: the `input_folder` parameter should be just the path to the zip files. In your case this would probably be `~/Downloads`.
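For example, if the downloaded zip archives sit directly in the download folder, the relevant part of the config file would look like this (paths are illustrative):

```yaml
gpt: /usr/local/snap/bin/gpt             # path to SNAP's gpt executable
input_folder: /home/user/Downloads       # folder containing the downloaded *.zip files
output_folder: /home/user/Downloads/processed
```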
Regarding the problem that no files are being found: during step 3 a filter option is specified:

```python
import yaml

with open('sample_config_file.yaml') as stream:
    data = yaml.safe_load(stream)
data['input_folder'] = input_folder
data['output_folder'] = output_folder
data['gpt'] = gpt_location
data['year'] = '2021'
```
In this case the year 2021 is specified as a filter option. The file you mentioned above is from 2015. Change the parameter to 2015 and it should work.
Once I have had a look at why the download via `sentinelsat` is not working, I will update the notebook to be more precise about the required input parameters.
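To make sure the edited parameters are actually picked up, the modified config can be written back to a new file before it is handed to the processor. A minimal sketch, assuming the same keys as in the snippet above (all paths and values are illustrative):

```python
import os
import yaml

# start from the sample configuration if it is present, otherwise from scratch
data = {}
if os.path.exists('sample_config_file.yaml'):
    with open('sample_config_file.yaml') as stream:
        data = yaml.safe_load(stream) or {}

# adjust the required parameters (paths are illustrative)
data['input_folder'] = '/home/user/Downloads'
data['output_folder'] = '/home/user/Downloads/processed'
data['gpt'] = '/usr/local/snap/bin/gpt'
data['year'] = '2015'  # must match the acquisition year of the downloaded scene

# write the modified configuration to the file the processor will read
with open('test_config_file.yaml', 'w') as stream:
    yaml.safe_dump(data, stream)
```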
Thanks, the `year` was definitely an issue. I've fixed that, moved the compressed (ZIP) archive into a new folder, and set that folder as the `input_folder`. I'm making progress: it has found the file, and the CPU is doing some work when the output `start step 1` is shown.
However, it seems that Step 1 did not complete. I am getting these messages:
INFO:root:Found files within input folder: 1
INFO:root:Number of found files for year 2015: 1
INFO:root:area of interest not specified
INFO:root:Number of found files that were double processed: 0.0
INFO:root:Number of found files with border issues: 0
INFO:root:normalisation angle not specified, default value of 35 is used for processing
INFO:ComponentProgress:0
INFO:ComponentProgress:0
INFO:root:Process S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.zip with SNAP.
Then, after `start step 1` appears:
INFO:root:1
INFO:root:skip processing for /home/arthur/Downloads/Sentinel1_SLC/S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.zip. File does not exist
INFO:root:No valid files found for pre-processing step 2.
INFO:root:skip processing for /home/arthur/Downloads/Sentinel1_SLC/S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.zip. File /home/arthur/Downloads/S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.SAFE/processed/step2/S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5_GC_RC_No_Su_Co.dim does not exist.
While it says that `File does not exist`, the file does indeed exist (it is the compressed archive I just moved to that folder); I can confirm this with `ls` on my system. As part of Step 1, it seems to have decompressed the archive, creating a sub-folder similar to what I would see if I manually extracted the archive. However, the only content of that subfolder (so far) is the `preview` folder and the KML file for previewing the browse image. Oddly, the browse image itself was not added to this folder.
Below is the Traceback:
---------------------------------------------------------------------------
IndexError Traceback (most recent call last)
<ipython-input-10-dc6747795641> in <module>
10 processing.pre_process_step2()
11 print('start step 3')
---> 12 processing.pre_process_step3()
13 print('start add netcdf information')
14 processing.add_netcdf_information()
/usr/local/dev/sar-pre-processing/sar_pre_processing/sar_pre_processor.py in pre_process_step3(self)
337
338 # Sort file list by date (hard coded position in filename!)
--> 339 file_path, filename, file_short_name, extension = self._decompose_filename(file_list[0])
340 file_list.sort(key=lambda x: x[len(file_path) + 18:len(file_path) + 33])
341 file_list_old = file_list
IndexError: list index out of range
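The comment "hard coded position in filename!" refers to the slice `x[len(file_path) + 18:len(file_path) + 33]` in the sort key, which pulls the acquisition timestamp out of the Sentinel-1 filename. A small sketch (illustrative, not the package's code) showing what that slice extracts for the file above:

```python
import os

def acquisition_timestamp(full_path):
    """Extract the start timestamp from a Sentinel-1 SLC file path.

    Mirrors the hard-coded slice x[len(file_path) + 18:len(file_path) + 33]:
    18 = 1 (path separator) + 17 (length of the fixed 'S1A_IW_SLC__1SSV_' prefix).
    """
    file_path = os.path.dirname(full_path)
    return full_path[len(file_path) + 18:len(file_path) + 33]

path = ('/home/arthur/Downloads/Sentinel1_SLC/'
        'S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.zip')
print(acquisition_timestamp(path))  # 20150514T134812
```

Note that the prefix length only works because the naming convention is fixed; any file list that is empty, or any filename with a different layout, breaks this line.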
I know this error. It is most likely caused by the SNAP Toolbox. There are two possible issues, which are mentioned in the "1. Requirements" section:
I will update the Jupyter notebook within the next hour.
The software will create three processing folders ("step1", "step2", "step3"). After each step the output file should be written to the respective folder.
Thanks, I do have `libgfortran` installed, so I believe it may be related to the SNAP version instead. Would you please update the README and the Documentation to specify that SNAP version 8.0.4 (or higher?) is required? I see that SNAP is listed as a dependency under "Module Requirements" in the README and also in the Documentation, but there is no version information.
It might also be helpful for users to know how to update to version 8.0.4 (I don't recall this version being available for direct download, so I assume most users will have to update their installation in the short term). For me, it was launching SNAP and running "Help" > "Check for Updates". The download may take a while.
Thank you for the comment. I will update the README and Documentation accordingly.
Also, SNAP has to be restarted after it finishes downloading plug-ins (it will not automatically restart). NetBeans will finish the update when the application is restarted.
I updated the Jupyter notebook. The config parameters are now separated into "required" and "optional" configuration options. Originally the code was developed to pre-process time series, not single images. But as I have seen that @arthur-e tried to use just one single image, I implemented a config option to also process just one image. With only one image a multi-temporal filter is not possible, therefore just a single speckle filter will be applied during the pre-processing steps.
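The fallback described above (a single speckle filter whenever only one image is available) can be sketched as a small decision function. Names and structure here are illustrative, not the package's actual API:

```python
def choose_speckle_filter(num_files, config):
    """Pick a speckle filter: a multi-temporal filter needs more than one image."""
    wants_multi = (config.get('speckle_filter', {})
                         .get('multi_temporal', {})
                         .get('apply') == 'yes')
    if wants_multi and num_files < 2:
        # mirrors the package's behaviour: fall back to a single filter
        return 'single'
    return 'multi_temporal' if wants_multi else 'single'

# with one input file, a requested multi-temporal filter degrades to 'single'
cfg = {'speckle_filter': {'multi_temporal': {'apply': 'yes'}}}
print(choose_speckle_filter(1, cfg))  # single
```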
Please reinstall the "SenSARP" package
I'm still having this issue. I have:

- `git pull`ed the latest changes to this repo;
- `libgfortran` installed (`gfortran is already the newest version (4:9.3.0-1ubuntu2)`);
- `conda`.

Here are the contents of my `test_config_file.yaml` file:
```yaml
---
gpt: /usr/local/dev/esa-snap/bin/gpt
input_folder: /home/arthur/Downloads/Sentinel1_SLC/
output_folder: /home/arthur/Downloads/Sentinel1_SLC/processed
speckle_filter:
  multi_temporal:
    apply: 'no'
    files: '1'
```
And here is the output with errors; I can confirm I get the same error if I add the (correct) "year" configuration parameter to the YAML file:
INFO:root:Found files within input folder: 1
INFO:root:year not specified
INFO:root:area of interest not specified
INFO:root:Number of found files that were double processed: 0.0
INFO:root:Number of found files with border issues: 0
INFO:root:area of interest not specified, whole images will be processed
INFO:root:normalisation angle not specified, default value of 35 is used for processing
INFO:ComponentProgress:0
INFO:ComponentProgress:0
INFO:root:Process S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.zip with SNAP.
start step 1
INFO:root:1
INFO:root:skip processing for /home/arthur/Downloads/Sentinel1_SLC/S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.zip. File does not exist
INFO:root:No valid files found for pre-processing step 2.
INFO:root:skip processing for /home/arthur/Downloads/Sentinel1_SLC/S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.zip. File /home/arthur/Downloads/Sentinel1_SLC/processed/step2/S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5_GC_RC_No_Su_Co.dim does not exist.
I am noticing an error in the console (from where Jupyter Notebook was launched) that may be relevant; starting with a kernel restart message:
[I 09:17:53.730 NotebookApp] Kernel restarted: 5a443a9c-75eb-493b-802d-a64353d8e4bc
[I 09:17:53.821 NotebookApp] Restoring connection for 5a443a9c-75eb-493b-802d-a64353d8e4bc:c1874a900f52437f9a1d4a35e13557ed
[I 09:17:54.830 NotebookApp] Replaying 3 buffered messages
INFO: org.esa.snap.core.gpf.operators.tooladapter.ToolAdapterIO: Initializing external tool adapters
INFO: org.esa.snap.core.util.EngineVersionCheckActivator: Please check regularly for new updates for the best SNAP experience.
Executing processing graph
INFO: org.esa.s1tbx.commons.io.ImageIOFile: Using FileCacheImageInputStream
INFO: org.hsqldb.persist.Logger: dataFileCache open start
done.
Error: [NodeId: BandMaths(2)] Could not parse expression: 'Sigma0_VH*(sin(rad(projectedLocalIncidenceAngle))/sin(rad(incidenceAngleFromEllipsoid)))'. Undefined symbol 'Sigma0_VH'.
This console error appears each time the cell with `processing.create_processing_file_list()` is run.
I'll now try this for a multi-temporal dataset (executing Step 2 and downloading files first).
I was not able to reproduce this error. However, as pre-processing step 2 (co-registration of images) is not needed when processing one single image, I rewrote the newly implemented single-image feature to skip pre-processing step 2. New feature:

- skipping pre-processing step 2 in case of single image processing
- the software is now able to detect if just one image is given as input data. If a single image is detected, incorrect configuration parameters like

```yaml
single_file: 'no'
speckle_filter:
  multi_temporal:
    apply: 'yes'
```

will be overwritten and the following statements will be given:
INFO:root:Single image, no co-register of images necessary
INFO:root:multi temporal filter cannot applied to a single image, just single speckle filter is applied
The new implementation should fix the problem. Please try it out and let me know. Thank you
I'm getting a different error now, so making progress! I'm getting an `IndexError` on line 546 in `sar_pre_processor.py`:

```python
file_path, filename, file_short_name, extension = self._decompose_filename(file_list[0])
```
It appears that `file_list` is an empty list. Above this line, there is a conditional that sets up `file_list` as an empty list, starting on line 530:
```python
if self.file_list is None:
    logging.info('no file list specified, therefore all images in output folder step2 will be processed')
    file_list = self._create_file_list(self.config.output_folder_step2, '*.dim')
else:
    file_list = []
    ...
```
For me, `self.file_list` is a tuple with two elements:

```python
(['/home/arthur/Downloads/Sentinel1_SLC/S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.zip'],
 [])
```

I'm not sure if that second empty list is another potential issue. Because `self.file_list` is not `None`, `file_list` is initialized as an empty list. On the first pass iterating through `self.file_list`, there is one filename passed to `self._decompose_filename()`:
```python
# Starting on Line 537
file_path, filename, file_short_name, extension = self._decompose_filename(file)
new_file_name = os.path.join(self.config.output_folder_step2, file_short_name +
                             self.name_addition_step1 + self.name_addition_step2 + '.dim')
if os.path.exists(new_file_name) is True:
    file_list.append(new_file_name)
else:
    logging.info(f'skip processing for {file}. File {new_file_name} does not exist.')
```
So, `new_file_name` gets created as:

```
/home/arthur/Downloads/Sentinel1_SLC/step2/S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5_GC_RC_No_Su_Co.dim
```

The `step2` folder was created at some point but the folder is empty; the file `S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5_GC_RC_No_Su_Co.dim` isn't created, so the check `os.path.exists(new_file_name)` will always fail. Then `file_list` does not get `new_file_name` appended to it, so it remains empty.
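A quick way to confirm this diagnosis before running step 3 is to check the step-2 output folder directly; a sketch with an illustrative path:

```python
import glob
import os

# illustrative path: the folder where step 2 should have written its .dim results
step2_folder = '/home/arthur/Downloads/Sentinel1_SLC/processed/step2'

dim_files = glob.glob(os.path.join(step2_folder, '*.dim'))
if not dim_files:
    print(f'step 2 wrote no .dim files to {step2_folder}; '
          'step 3 will fail with an empty file list')
```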
Updated version is online. Can you please report if the error still exists? Please use jupyter notebook for single image processing. Thank you.
@arthur-e : Did the updated version solve this issue? If so I would like to close this issue.
@McWhity I have not yet had a chance to evaluate these changes to the code. I have no objections to closing it; if future users experience a similar problem, they can re-open this issue.
> Thanks, the `year` was definitely an issue. [...] However, it seems that Step 1 did not complete. [...] `IndexError: list index out of range`
@McWhity Hi! I'm trying to use the SenSARP package, but I get the exact same issue as above. I followed the advice quoted below:
> I was not able to reproduce this error. However, as pre-processing step 2 (co-registration of images) is not needed when processing one single image, I rewrote the newly implemented single-image feature to skip pre-processing step 2. New feature:
>
> - skipping pre-processing step 2 in case of single image processing
> - the software is now able to detect if just one image is given as input data. If a single image is detected, incorrect configuration parameters like
>
> ```yaml
> single_file: 'no'
> speckle_filter:
>   multi_temporal:
>     apply: 'yes'
> ```
>
> will be overwritten and the following statements will be given:
>
> `INFO:root:Single image, no co-register of images necessary`
> `INFO:root:multi temporal filter cannot applied to a single image, just single speckle filter is applied`
>
> The new implementation should fix the problem. Please try it out and let me know. Thank you
But I still get the same error, even when not running step 2.
My config file looks like this now:
```yaml
---
gpt: /home/User1/snap/snap/bin/gpt
input_folder: /home/User1/slc-scenes
output_folder: /home/User1/slc-result
region:
  lr:
    lat: 55.622278
    lon: 13.301911
  subset: 'yes'
  ul:
    lat: 55.635022
    lon: 13.278522
single_file: 'yes'
speckle_filter:
  multi_temporal:
    apply: 'no'
    files: '1'
year: 2023
```
I have a fresh environment created with Miniconda and SNAP 8.0.6 installed. I would really appreciate some clues as to why it does not work. I will try to debug myself while waiting for an answer.
> INFO:root:Found files within input folder: 1
> INFO:root:Number of found files for year 2015: 1
> [...]
> INFO:root:Process S1A_IW_SLC__1SSV_20150514T134812_20150514T134839_005918_0079FD_90D5.zip with SNAP.
At this stage you processed a file from 2015, but your config file below shows that you want to process a scene from 2023.
> My config file looks like this now: [...] `year: 2023`
What happens if you remove the "year" option from your config file (meaning all found files should be processed)? You can also set the year to 2015.
@McWhity Hi, I'm not trying to process images from 2015; that was text from a previous message in this thread that I quoted, since the issue I have is similar. I have the same config file as in my last post.
I have this code:

```python
from sar_pre_processing.sar_pre_processor import SARPreProcessor
import warnings
warnings.filterwarnings("ignore")

processing = SARPreProcessor(config='test_config_file.yaml')
processing.create_processing_file_list()
print('start step 1')
processing.pre_process_step1()
print('start step 2')
processing.pre_process_step2()
print('start step 3')
processing.pre_process_step3()
print('start add netcdf information')
processing.add_netcdf_information()
print('start create netcdf stack')
processing.create_netcdf_stack()
```
My output looks like this:
INFO:root:Found files within input folder: 1
INFO:root:Number of found files for year 2023: 1
INFO:root:area of interest not specified
INFO:root:Number of found files that were double processed: 0.0
INFO:root:Number of found files with border issues: 0
INFO:root:area of interest specified
INFO:root:normalisation angle not specified, default value of 35 is used for processing
INFO:ComponentProgress:0
INFO:ComponentProgress:0
INFO:root:Process S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE.zip with SNAP.
start step 1
INFO: org.esa.snap.core.gpf.operators.tooladapter.ToolAdapterIO: Initializing external tool adapters
INFO: org.esa.snap.core.util.EngineVersionCheckActivator: Please check regularly for new updates for the best SNAP experience.
Executing processing graph
SEVERE: org.esa.s1tbx.io.sentinel1.Sentinel1Level1Directory: S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE/measurement/s1a-iw3-slc-vh-20230515t052433-20230515t052501-048540-05d6ac-003.tiff not found
SEVERE: org.esa.s1tbx.io.sentinel1.Sentinel1Level1Directory: S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE/measurement/s1a-iw3-slc-vv-20230515t052433-20230515t052501-048540-05d6ac-006.tiff not found
SEVERE: org.esa.s1tbx.io.sentinel1.Sentinel1ProductReader: Unable to load quicklook S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D
INFO: org.hsqldb.persist.Logger: dataFileCache open start
OpenSearch: https://scihub.copernicus.eu/gnss/search?q=platformname:Sentinel-1 AND platformnumber:A AND producttype:AUX_POEORB AND beginposition:[2023-05-01T00:00:000Z TO 2023-05-31T24:00:000Z]
OpenSearch: 3 total results on 1 pages.
OpenSearch: https://scihub.copernicus.eu/gnss/search?q=platformname:Sentinel-1 AND platformnumber:A AND producttype:AUX_POEORB AND beginposition:[2023-05-01T00:00:000Z TO 2023-05-31T24:00:000Z]
OpenSearch: https://scihub.copernicus.eu/gnss/search?q=platformname:Sentinel-1 AND platformnumber:A AND producttype:AUX_POEORB AND beginposition:[2023-06-01T00:00:000Z TO 2023-06-31T24:00:000Z]
OpenSearch: 0 total results on 1 pages.
WARNING: org.esa.s1tbx.orbits.gpf.ApplyOrbitFileOp: No valid orbit file found for 15-MAY-2023 05:23:31.110659
Orbit files may be downloaded from https://scihub.copernicus.eu/gnss/odata/v1/
and placed in /home/niftitech/.snap/auxdata/Orbits/Sentinel-1/POEORB/S1A/2023/05
WARNING: org.esa.s1tbx.orbits.gpf.ApplyOrbitFileOp: Using Sentinel Restituted /home/niftitech/.snap/auxdata/Orbits/Sentinel-1/RESORB/S1A/2023/05/S1A_OPER_AUX_RESORB_OPOD_20230515T084430_V20230515T045046_20230515T080816.EOF.zip instead
Error: [NodeId: TOPSAR-Deburst] java.lang.NullPointerException
INFO:root:1
INFO:root:skip processing for /my/disk/S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE.zip. File does not exist
INFO:root:No valid files found for pre-processing step 2.
INFO:root:skip processing for /my/disk/S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE.zip. File /my/disk/scenes/step1/S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE_GC_RC_No_Su.dim does not exist.
java.lang.NullPointerException
done.
start step 3
I get this error:
IndexError Traceback (most recent call last)
Cell In[12], line 14
12 processing.pre_process_step2()
13 print('start step 3')
---> 14 processing.pre_process_step3()
15 print('start add netcdf information')
16 processing.add_netcdf_information()
File ~/code/senSarp/sar_pre_processing/sar_pre_processor.py:588, in SARPreProcessor.pre_process_step3(self)
583 logging.info(
584 f'skip processing for {file}. File {new_file_name} does not exist.')
586 # Sort file list by date (hard coded position in filename!)
587 file_path, filename, file_short_name, extension = self._decompose_filename(
--> 588 file_list[0])
589 file_list.sort(
590 key=lambda x: x[len(file_path) + 18:len(file_path) + 33])
591 file_list_old = file_list
IndexError: list index out of range
Would greatly appreciate your help.
> INFO:root:skip processing for /my/disk/S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE.zip. File /my/disk/scenes/step1/S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE_GC_RC_No_Su.dim does not exist.
The above line means that there is no output file after the first processing step. Thus the problem seems to occur during the first part of the processing.
> SEVERE: org.esa.s1tbx.io.sentinel1.Sentinel1Level1Directory: S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE/measurement/**s1a-iw3-slc-vh-20230515t052433-20230515t052501-048540-05d6ac-003.tiff** not found
> SEVERE: org.esa.s1tbx.io.sentinel1.Sentinel1Level1Directory: S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE/measurement/**s1a-iw3-slc-vv-20230515t052433-20230515t052501-048540-05d6ac-006.tiff** not found

The above lines specify the error that occurs during the first processing step. Some parts of the input file (the downloaded zip) seem to be missing. The error states that two bursts (highlighted in bold) are missing, thus the processing fails. I just downloaded the file (S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D) from the ASF and found no missing bursts.
Can you check whether your zip file is corrupted? The six burst files (the names of the missing two are highlighted in bold) are stored at the following location: "/S1A_IW_SLC__1SDV_20230515T052433_20230515T052501_048540_05D6AC_691D.SAFE/measurement/"
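One way to check the archive without SNAP is Python's standard `zipfile` module; a sketch that verifies the member CRCs and counts the burst measurement files (the path in the usage comment is illustrative):

```python
import fnmatch
import zipfile

def check_slc_archive(path):
    """Verify member CRCs and count the measurement TIFFs in an SLC zip.

    Returns (first_corrupt_member_or_None, number_of_measurement_tiffs).
    """
    with zipfile.ZipFile(path) as zf:
        bad = zf.testzip()  # first member with a CRC error, or None if intact
        tiffs = fnmatch.filter(zf.namelist(), '*measurement/*.tiff')
        return bad, len(tiffs)

# usage (path is illustrative); a dual-pol IW SLC should contain 6 tiffs:
# bad, n_tiffs = check_slc_archive('S1A_IW_SLC__1SDV_..._691D.SAFE.zip')
```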
> Error: [NodeId: TOPSAR-Deburst] java.lang.NullPointerException

This is the error expression. The error occurs during the Deburst step (the combining of all bursts). As stated in the previous comment, it seems that two files are missing from the downloaded zip file.
@McWhity
The strange thing is, I downloaded my test data from here: https://dataspace.copernicus.eu/browser.
It downloaded just fine, but as described in my previous answer it did not work.
I then downloaded data by following the guide for SenSARP, using the `sentinelsat` package. Then it worked perfectly.
I can't really see why it should matter. I unzipped both files and compared the contents, and it looks like it is exactly the same set of files. To the left is the one downloaded with the `sentinelsat` package, and on the right is the one downloaded using the Copernicus browser.
Have you got any idea why it does not work with the file downloaded using Copernicus browser?
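To rule out subtle differences between the two downloads, the archives can also be compared member-by-member with checksums rather than by eye. A sketch using only the standard library (file names in the usage comment are illustrative):

```python
import zipfile

def archive_digest(path):
    """Map each member name to its stored CRC and size for comparison."""
    with zipfile.ZipFile(path) as zf:
        return {info.filename: (info.CRC, info.file_size) for info in zf.infolist()}

def compare_archives(a, b):
    """Return member names that are missing or differ in content between two zips."""
    da, db = archive_digest(a), archive_digest(b)
    return sorted(name for name in set(da) | set(db) if da.get(name) != db.get(name))

# usage (illustrative): an empty result means identical contents
# print(compare_archives('sentinelsat_download.zip', 'browser_download.zip'))
```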
I tried it myself, and as you said, it works with the file downloaded from ASF or with `sentinelsat`, but it does not work with the one from https://dataspace.copernicus.eu/browser. Right now I have no idea why.
Interesting, thank you for testing. Seems really strange that there should be a difference in the data.
I am running the examples in the Jupyter Notebook at `docs/notebooks/running_test_application.ipynb` and encountering some problems. In Step 3, the first three pre-processing steps are executed. However, no files are being found, as is evident in the output.
It's not apparent what the `input_folder` parameter of the configuration file should be. I downloaded a Sentinel-1 SLC granule, single-pol (VV) IW mode image from the Alaska Satellite Facility DAAC (via NASA Earth Data) and after unzipping the archive, the directory looks like this: [...] Inside the `measurement` folder are three GeoTIFF files, one for each IW swath. I tried to run this with the `input_folder` set to the archive root and then again set to the `measurement` folder, but in both cases no files were found.