johntruckenbrodt / pyroSAR

framework for large-scale SAR satellite data processing
MIT License

'NoneType' object has no attribute 'text' with removeS1BorderNoiseMethod='pyroSAR' and old S-1 datasets #298

Closed · griembauer closed this issue 7 months ago

griembauer commented 7 months ago

- if applicable, which version of SNAP or GAMMA are you using in pyroSAR?
SNAP 9
- the full error message
I am processing a long time series of S-1 data, and for scenes acquired between roughly April 2018 and today the processing described above works well. However, for scenes from the first quarter of 2018 (and, I assume, earlier), `snap.geocode(infile=s1_file_id, **kwargs)` does not throw an error, but it also does not start the `gpt` processing (as visible via `htop`). Instead, a log file is created which only contains

'NoneType' object has no attribute 'text'



The processing graph produced by `snap.geocode` can be run successfully with `gpt` on the command line without errors, so it is the call of `snap.gpt` from within `snap.geocode` that makes the difference: passing `removeS1BorderNoiseMethod='ESA'` to `snap.gpt`/`snap.geocode` solves the issue.
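For reference, the working call looks roughly like this (a sketch assuming `s1_file_id` and `kwargs` as defined above, not the exact code I am running):

from pyroSAR import snap

# workaround sketch: let SNAP's own border noise removal handle the scene
# instead of pyroSAR's implementation
snap.geocode(infile=s1_file_id,
             removeS1BorderNoiseMethod='ESA',
             **kwargs)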

From the documentation I see that the IPF version makes a difference, but if I understand correctly, the default option `removeS1BorderNoiseMethod='pyroSAR'` should work in any case? 
johntruckenbrodt commented 7 months ago

Hi @griembauer, thanks for reaching out and reporting this. The linked PR should fix it.
I was looking at your parameterization (I replaced the t_srs value with the EPSG code):

kwargs = {
    't_srs': 2157,
    'returnWF': True,
    'terrainFlattening': True,
    'refarea': 'gamma0',
    'export_extra': ['layoverShadowMask'],
    'outdir': '/data/s1_results_timeline_extension/tests',
    'speckleFilter': None,
    'spacing': 20.0,
    'externalDEMFile': '/data/dem/ireland_dem_20m_int.tif',
    'externalDEMNoDataValue': None,
    'externalDEMApplyEGM': True,
    'alignToStandardGrid': True,
    'demResamplingMethod': 'BILINEAR_INTERPOLATION',
    'imgResamplingMethod': 'BILINEAR_INTERPOLATION',
    'standardGridOriginX': 400900.0,
    'standardGridOriginY': 1002740.0,
    'clean_edges': True,
    'polarizations': ['VV', 'VH'],
    'scaling': 'dB',
    'groupsize': 999,
    'gpt_args': ['-x', '-c', '8G', '-q', '1'],
    'tmpdir': '/data/temp_dir_calculations',
    'shapefile': '/data/grassdata/loc_2157/PERMANENT/.tmp/b64d9297e23a/1025.0.geojson'
}

I highly recommend increasing the number of cores and the amount of memory via gpt_args; what you have defined is likely not enough for this processing. Furthermore, splitting the workflow into smaller groups via groupsize helps a lot in reducing the resources needed per gpt call.
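As a rough sketch of those two suggestions applied to the kwargs above (the numbers are just placeholders, not values from this thread):

# illustrative resource settings: a smaller groupsize splits the workflow
# into several shorter gpt calls, and larger -c/-q values give each call
# more tile cache memory and more parallel threads
kwargs['groupsize'] = 2
kwargs['gpt_args'] = ['-x', '-c', '16G', '-q', '8']
snap.geocode(infile=s1_file_id, **kwargs)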

griembauer commented 7 months ago

Great, thanks a lot for the quick help and the recommendations! For the limited AOI I have, the configuration works well, but I guess I am a bit too cautious about giving SNAP/gpt a lot of resources ;) I will experiment with more cores and the groupsize parameter!

johntruckenbrodt commented 7 months ago

Sure. If it works well for your case it's all good. I usually do something like this to control memory, cache (75% of the memory), and cores:

['-J-Xmx32G', '-c', '24G', '-q', '32']
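Roughly, what those flags mean when gpt picks them up (my reading of the standard gpt options, added here for context):

# -J-Xmx32G : passes -Xmx32G to the gpt Java VM, i.e. a 32 GB maximum heap
# -c 24G    : SNAP tile cache size, here about 75% of that heap
# -q 32     : number of parallel threads used for graph processing
kwargs['gpt_args'] = ['-J-Xmx32G', '-c', '24G', '-q', '32']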

I'll release this fix with the next version in the coming days.