NeoGeographyToolkit / StereoPipeline

The NASA Ames Stereo Pipeline is a suite of automated geodesy and stereogrammetry tools for processing imagery captured by orbiting and landed robotic explorers on other planets.
Apache License 2.0

dem generation using aster #361

Closed xinluo2018 closed 2 years ago

xinluo2018 commented 2 years ago

Hi, recently I have been using the ASP tool to generate DEM data from ASTER stereo images, and I find the ASTER images are mostly contaminated by clouds. Will clouds affect DEM generation with the ASP tool, or do I have to filter out the cloud-contaminated ASTER images manually? Thanks.

oleg-alexandrov commented 2 years ago

Stereo will work poorly with clouds. You'll have to find images which mostly lack clouds and which have similar illumination.

I am not sure what you mean by "filter images". All you can do is select good images. Maybe that's what you mean. I don't think there's a good way to filter the clouds from the images.

xinluo2018 commented 2 years ago

Thank you for your response. What I mean is: I have to select the good images manually as input to the ASP algorithm. If I have many images (from different years) to process, it will be a little laborious…


oleg-alexandrov commented 2 years ago

The key issue here is that you won't get good results if your inputs have too much cloud cover. Those can hide the true surface of the ground, so then there's no good info to use.

Some vendors specify in the metadata what the cloud cover percentage is. Maybe ASTER does too, I am not sure.

ASP can handle some clouds, but sometimes its runtime can go through the roof. If you have a testcase which is mostly good but which gets confused by a cloud or two, let me know, I am now chasing such a problem in a different context and may have some suggestions.

The short story is that you have to throw out bad inputs. I think the ASTER search interface at https://search.earthdata.nasa.gov/search offers thumbnail previews, which can help you rule out bad datasets (if I recall right). If you have a thousand image pairs, it may take several hours to manually inspect the thumbnails. Not fun but doable.

Maybe it is also possible to get an idea of what the pixel values of clouds are. Normally clouds are white, so they show up very bright. Then some kind of simple algorithm using OpenCV could identify how much of the area is cloud, so one could automate things that way.
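A rough sketch of that brightness-threshold idea follows. The threshold value and the synthetic data are made up for illustration; a real script would read the ASTER band with GDAL or OpenCV and tune the threshold per scene (and, as noted later in this thread, bright snow and ice will also trip such a test):

```python
import numpy as np

def cloud_fraction(band, threshold=200.0):
    """Estimate the fraction of cloudy pixels in an image band.

    Clouds are usually much brighter than the ground, so a simple
    brightness threshold gives a rough cloud-cover estimate. The
    default threshold is a guess and needs tuning per sensor/scene.
    """
    band = np.asarray(band, dtype=np.float64)
    valid = np.isfinite(band)
    if valid.sum() == 0:
        return 0.0
    return float((band[valid] > threshold).sum() / valid.sum())

# Synthetic example: a 100x100 image where one quadrant is "cloud".
img = np.full((100, 100), 80.0)
img[:50, :50] = 240.0  # bright quadrant standing in for cloud
print(cloud_fraction(img))  # 0.25
```

One could then skip any image pair whose estimated cloud fraction exceeds some cutoff, before ever running stereo.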

xinluo2018 commented 2 years ago

Hi, I have used the ASP tool to generate DEM data; however, the obtained DEM has a lot of missing data. Do you know the reason for this? The obtained DEM data is attached. Thanks.


oleg-alexandrov commented 2 years ago

Your attachment did not arrive. Maybe you can visit the issue on GitHub and attach a picture there. I suggest looking at your inputs; maybe their quality is not so good, or the illumination is too different. Or you can try another dataset.

oleg-alexandrov commented 2 years ago

I made a sample ASTER example, together with instructions on how to run it, at: https://github.com/NeoGeographyToolkit/StereoPipelineSolvedExamples/releases

Maybe that could help you understand why the quality is not so good in your case.

dshean commented 2 years ago

Nice!

Also worth mentioning this older ASTER example (implemented by @ShashankBice in a notebook with rendered figures): https://github.com/uw-cryo/asp-binder-demo/blob/master/example-aster_on_pangeo_binder_draft.ipynb.

Still hoping we can secure some funding to improve examples like this at some point in the future. @oleg-alexandrov - another option is to stage the example larger sample datasets on Zenodo, rather than github release. But whatever works!

oleg-alexandrov commented 2 years ago

I added that to the doc too. I may try to figure out Zenodo at some point, for now the GitHub release interface seems to be reasonably convenient.

I plan to slowly add more solved examples as I find time. That will make it much easier to get started especially for planetary data, where if one uses the recently adopted CSM cameras a self-contained example can be made while avoiding hundreds of GB of supporting kernel data and extra ISIS software for processing it.

xinluo2018 commented 2 years ago

These are the input images and the resulting DEM; maybe the missing data is caused by poor image quality? But the input images were carefully selected, and compared to other ASTER images they already seem better.

[Screenshot: Screen Shot 2022-03-17 at 10 45 41 AM]

xinluo2018 commented 2 years ago

> I made a sample ASTER example together with instructions with how to run it at: https://github.com/NeoGeographyToolkit/StereoPipelineSolvedExamples/releases
>
> Maybe that could help understand why in your case the quality is not so good.

Thanks, I will try this example.

xinluo2018 commented 2 years ago

> Nice!
>
> Also worth mentioning this older ASTER example (implemented by @ShashankBice in a notebook with rendered figures): https://github.com/uw-cryo/asp-binder-demo/blob/master/example-aster_on_pangeo_binder_draft.ipynb.
>
> Still hoping we can secure some funding to improve examples like this at some point in the future. @oleg-alexandrov - another option is to stage the example larger sample datasets on Zenodo, rather than github release. But whatever works!

Thanks. What I want to do is produce DEM data for a local Tibet region using the ASP tool, and I have referred to some of your related papers.

oleg-alexandrov commented 2 years ago

Thank you for sharing the images. The left and right ones seem to have a lot of clouds. I am not sure why some data is missing in the center image; maybe the snow there is featureless. You can zoom in on that center dataset and see what is going on. See also https://stereopipeline.readthedocs.io/en/latest/tutorial.html#dealing-with-terrain-lacking-large-scale-features.

Hopefully as you play more with the tools you will see where they do well and where their limitations are. Ideally stereo wants clean data with many features it can match from the left to the right image.

xinluo2018 commented 2 years ago

The mapproject command may have a problem. If I project the parsed run-Band3N to WGS84, the true extent of my region is [-122.349, -121.349, 46.708, 47.341]; when I use mapproject, however, the result extent is [237.653, 238.640, 46.725, 47.335]. I think there may be a problem with the mapproject script, so I am reporting it to you.

oleg-alexandrov commented 2 years ago

You should try to project onto a DEM, for example, from here: https://portal.opentopography.org/raster?opentopoID=OTSDEM.032021.4326.3 or here: https://portal.opentopography.org/raster?opentopoID=OTSRTM.082015.4326.1.

But I think even in that case you may get a 360 degree offset like now. Normally that is not a problem, I believe.

I checked that if you specify in mapproject:

--t_projwin -122.349 46.708 -121.349 47.341

you will likely get the right answer. Note that the values there are in the order min_lon min_lat max_lon max_lat, which is not the same order as what you wrote above.

I will have to check why mapproject adds a 360 degree offset.

xinluo2018 commented 2 years ago

> You should try to project onto a DEM, for example, from here: https://portal.opentopography.org/raster?opentopoID=OTSDEM.032021.4326.3 or here: https://portal.opentopography.org/raster?opentopoID=OTSRTM.082015.4326.1.
>
> But I think even in that case you may get a 360 degree offset like now. Normally that is not a problem, I believe.
>
> I checked that if you specify in mapproject:
>
> --t_projwin -122.349 46.708 -121.349 47.341
>
> you will likely get the right answer. Note that the values there are not in the same order as what you wrote above.
>
> I will have to check why mapproject adds a 360 degree offset.

Hi, does this mean that all longitudes obtained by mapproject should have a -360 degree offset applied, or only longitudes larger than 180? If you are available, could you fix mapproject so that the obtained longitude range falls within [-180, 180]? Most of the time, geographic images seem to use longitudes in [-180, 180]. Thanks.

oleg-alexandrov commented 2 years ago

I looked into this 360 degree offset issue. Both Google Maps and OpenTopography use [-180, 0] for the longitude in the Western hemisphere. Overall, I did not see anybody use a longitude like 240 degrees. It would be either -120 E or 120 W.

So, your suggestion that the mapprojected images in the Western Hemisphere should use a negative longitude rather than a longitude in [180, 360] makes sense.

Unfortunately, both our mapproject and DEM creation tool use the following convention. If the longitude is within [-90, 90] degrees, then it is kept there, but if it is outside of this, it is shifted so that it is in [0, 360]. For example, a Florida DEM would have the longitude in [-180, 180], but a California DEM in [0, 360].
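To make the stated convention concrete, it can be written as a tiny helper; this is only an illustration of the rule as described here, not ASP's actual code:

```python
def asp_longitude(lon):
    # Longitudes within [-90, 90] degrees are kept as-is; anything
    # outside that band is wrapped into [0, 360] (the convention
    # described above).
    if -90.0 <= lon <= 90.0:
        return lon
    return lon % 360.0

print(asp_longitude(-81.5))   # a Florida longitude stays negative
print(asp_longitude(-120.3))  # a California longitude becomes ~239.7
```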

This behavior has been around for more than 10 years, and nobody was unhappy about it, likely because a GIS tool should have no issues handling both.

I am reluctant to change this now, as it would break too many of our tests, and I am still not sure what is the proper thing and what is assumed for other planets (ASP is used heavily for the Moon, etc).

I will keep this issue in mind. In the meantime, maybe you can use the --t_projwin trick I suggested before.

xinluo2018 commented 2 years ago

Got it, thanks.

xinluo2018 commented 2 years ago

So, to be consistent with other tools like QGIS, I will process the data (e.g., DEM generation) in the UTM coordinate system as much as possible; that way the longitude issue can be avoided. I think that could be a solution.

oleg-alexandrov commented 2 years ago

Does QGIS fail to load or interpret correctly a DEM produced with longitude > 180? Yeah, UTM should work.

xinluo2018 commented 2 years ago

I just checked it again and it is OK in QGIS; maybe I was not handling something correctly before. Thank you again.

xinluo2018 commented 2 years ago

It is just that I want to get the SRTM DEM for the same region as the ASTER image; however, the DEM download API only supports longitudes within [-180, 180], so I have to transform the longitude of the ASTER image obtained with the mapproject tool. For the UTM zone calculation, I also have to adjust the common algorithm somewhat. Besides that, it is OK, and I am gradually becoming familiar with this uncommon setting.

xinluo2018 commented 2 years ago

Hi, I have another question. During DEM generation, an IntersectionErr.tif file is also generated in the result folder. What is IntersectionErr.tif? Does its data stand for the error of the generated DEM, and does a higher value in IntersectionErr.tif stand for lower quality of the generated DEM? In general, what error value (threshold) is acceptable for the generated DEM? Thank you.

oleg-alexandrov commented 2 years ago

> it is just I want to get the srtm dem of the same region of the aster image, however, the dem download API just supports the longitude within [-180, 180], so I have to transform the longitude of the aster image which is obtained by using mapproject tool.

Whether a DEM's extent should be in [0, 360] or [-180, 180] is a valid point. Our approach is indeed kind of nonstandard.

For now I added an option to our image_calc tool which allows one to apply a given longitude shift to a GeoTiff, so you can fix it this way after it is produced. This will be in the daily build on the GitHub releases page maybe tomorrow. The manual page and an example are here: https://stereopipeline.readthedocs.io/en/latest/tools/image_calc.html.

> in the dem result folder, during the dem generation, a IntersectionErr.tif file was also generated. I don't know what the IntersectionErr.tif is, is the data in the IntersectionErr.tif stand for the error of the generated dem data, and does a higher value in the IntersectionErr.tif stand for lower quality of the generated dem data? in general, which the error value (threshold) can be accepted for the generated dem data.

The intersection error is the closest distance between rays intersecting on the ground. Values with errors above a certain threshold are already filtered out automatically with the point2dem option --remove-outliers-params before the DEM is even created.

Yes, very high values in this error usually mean your given DEM grid point is not good. However, it is hard to say that slightly higher values stand for a slightly lower quality DEM. If your cameras are not perfectly consistent, so there is some unknown error in camera orientation, that will result in a higher intersection error, but the DEM is likely still rather good. Note that bundle_adjust can be used to reduce the intersection error.

Also, the intersection error can vary somewhat due to artifacts in the images or in the camera model which computes the rays.

So, the intersection error is important and you should examine it, but it is not a direct measure of DEM quality.

We provide a tool called corr_eval (https://stereopipeline.readthedocs.io/en/latest/tools/corr_eval.html), which I am still working on, that can help with evaluating DEM quality. It shows how strong the correlation is between a left image pixel and the corresponding right image pixel (for now the tool is slow, but I will speed it up soon). A higher correlation likely means the pixel is more reliable, as its left-image neighborhood is more similar to the determined right-image neighborhood. But there is no clear threshold here either for what a reliable pixel is.
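For illustration, post-hoc masking of a DEM by its intersection-error raster could look like the sketch below. The arrays stand in for rasters that would be read with GDAL, the threshold is arbitrary, and point2dem already applies its own outlier removal via --remove-outliers-params, so this is only an extra, optional filtering step:

```python
import numpy as np

def mask_dem_by_error(dem, err, max_err):
    """Set DEM cells to NaN wherever the per-cell ray intersection
    error exceeds max_err. dem and err are co-registered 2D arrays
    (in practice, the run-DEM.tif and run-IntersectionErr.tif grids)."""
    out = dem.astype(np.float64).copy()
    out[err > max_err] = np.nan
    return out

dem = np.array([[100.0, 101.0],
                [102.0, 103.0]])
err = np.array([[0.5, 5.0],
                [0.4, 0.3]])
masked = mask_dem_by_error(dem, err, 2.0)
# The cell with intersection error 5.0 becomes NaN; the rest are kept.
```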

xinluo2018 commented 2 years ago

Thanks for your response; I will try it. I am conducting a study on monitoring ice melting (in a local Tibetan Plateau region) over several decades, and I hope to gain some new findings using ASTER stereo-based DEM data.

oleg-alexandrov commented 2 years ago

Ice volume change sounds like fun. You will need very tight registration of the rock part, of course, to be able to measure accurately what changed in the ice. Our tools have been used for such work in the past with good success, but I think with WorldView data, which are likely bigger in footprint and higher in resolution.

Since there was mention of issues with clouds earlier in this thread, I've just made some improvements to the tools and docs for this case and put a writeup here: https://stereopipeline.readthedocs.io/en/latest/tutorial.html#dealing-with-clouds. The hope is that after mapprojection one can have the ground mostly in the right location and then the clouds, being higher and variable, are easier to isolate.

xinluo2018 commented 2 years ago

I will try it following your kind suggestions. In fact, ASTER stereo has been successfully used for dynamic monitoring of ice volume change in some cases, such as https://www.nature.com/articles/s41586-021-03436-z; that paper performs DEM generation through an improved method (as the authors say). Given the rich features of the ASP tool and its elaborate documentation, I think ASP can also be used to extract fine DEM information from ASTER images in support of ice volume change monitoring. Moreover, ASP seems applicable to other satellites, such as WorldView, which makes it convenient for many other specific study cases. Nevertheless, I have to become familiar with the ASP tool first.

xinluo2018 commented 2 years ago

Hi, I found a problem with the --longitude-offset option of the image_calc tool: the 360 offset is applied to all longitudes, but in fact only longitudes > 180 need the offset. If the offset is applied to a longitude < 180, e.g., 120, the transformed longitude will be 120 - 360 = -240, which is out of the range [-180, 180]. Maybe it would be better if the tool automatically judged whether the offset needs to be applied.

oleg-alexandrov commented 2 years ago

Yes, the --longitude-offset option I put in image_calc will just apply any longitude shift the user wants, not even a multiple of 360.

The hope was that it would be useful for DEMs in the Western Hemisphere, if the user does not like their longitudes to be in [180, 360]. It does not make sense to use it for DEMs in Asia, for example, or close to the 0 or 180 meridians.

So, yes, some user judgement is needed. In the same way the user should use judgement to convert to a stereographic projection closer to poles. There is really no way to fully automate these choices.

oleg-alexandrov commented 2 years ago

To add: you may want to write a little Python tool which peeks at a DEM and, if it does not like its longitude range, calls the program I wrote to offset it. Speaking of tools, given that you plan to do large-scale processing, you may want to look at David's collection of scripts at https://github.com/dshean, since at some point ASP is just a set of programs, and one needs Python and its ability to call GDAL directly to put together automated workflows.
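A sketch of the decision logic such a helper might use follows. The function is pure logic; in a real script the longitude extent would be read from the GeoTiff's geotransform via GDAL, and the image_calc invocation shown in the comment is an assumption to be checked against the image_calc manual page linked earlier:

```python
def longitude_shift_for(min_lon, max_lon):
    """Decide what multiple-of-360 shift would bring a DEM's
    longitude extent into [-180, 180]. Returns 0.0 when no shift
    is needed. min_lon/max_lon would come from the GeoTiff's
    geotransform (e.g. via GDAL)."""
    if max_lon > 180.0:
        return -360.0
    if min_lon < -180.0:
        return 360.0
    return 0.0

# A Western-hemisphere DEM written with longitudes in [0, 360]:
print(longitude_shift_for(237.653, 238.640))  # -360.0
# A DEM already in [-180, 180]:
print(longitude_shift_for(-122.349, -121.349))  # 0.0

# The script would then invoke something like (flags to be verified
# against the image_calc documentation):
#   image_calc -c "var_0" --longitude-offset -360 dem.tif -o dem_fixed.tif
```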

xinluo2018 commented 2 years ago

I have implemented the automated conversion from [0, 360] to [-180, 180] with some scripts; the conversion is only carried out on longitudes > 180, that is, longitudes within [180, 360] are converted to [-180, 0], and longitudes within [0, 180] remain. Thank you.

xinluo2018 commented 2 years ago

Hi, I have completed the parameter settings for DEM generation using the ASP tool. If you are available, could you help check my settings file, located at https://github.com/xinluo2018/Glacier-in-RGI1305/blob/main/script/stereo.default? I want to use ASTER stereo images for time-series DEM generation, and then apply the generated DEMs to ice melting monitoring and glacier mass loss evaluation. Thank you.

adehecq commented 2 years ago

Hi xinluo2018, since you're interested in glacier applications, could you not leverage the dataset provided with the paper of Hugonnet et al. 2021? The glacier elevation change is provided worldwide, for 5-year time periods, at 100 m resolution. You can see it zoomed over West Kunlun here: http://maps.theia-land.fr/theia-cartographic-layers.html?year=2000-2020&month=09&collection=glaciers&zoom=10&lat=35.325&lng=80.996. I know that, depending on your exact application, the dataset might not have the optimal spatial or temporal resolution, but it would save you a lot of time compared to reprocessing data that has been processed before. If the dataset is not quite sufficient, you could contact Romain Hugonnet (@rhugonnet), who might be able to send you the processed ASTER DEMs over your area of interest directly. Similarly, David Shean (@dshean) also processed the whole ASTER archive in this area. Better not to re-invent the wheel!

oleg-alexandrov commented 2 years ago

Those are good suggestions above from people who actually use the tools and do work.

When it comes to parameters, I first recommend mapprojecting the images, which I think you are doing since you chose alignment-method none. For the option ip-filter-using-dem you may need quotes, like this: "srtm_utm_tmp.tif 100". This is a new option; I tested it quite a bit, and I hope you can experiment with it to see which value is best. Note that SRTM DEMs are normally relative to the geoid, so there is a vertical offset relative to what ASP would create (which is relative to WGS84), but maybe you got your SRTM DEM from the OpenTopography web site, where it is relative to WGS84 already.

I am not sure what to say about corr-kernel. If 15 15 works well for you, that's good.

xinluo2018 commented 2 years ago

Thank you for your kind sharing. The data provided by Hugonnet et al. could not meet our requirements. I will compare the DEM produced by ourselves with the one produced by Hugonnet et al.; accordingly, we will either further validate the conclusions of Hugonnet et al. or arrive at some new or more detailed findings for our study region. Hopefully our processing workflow will produce more accurate ASTER-derived DEM data, so that it can be applied to larger-scale regions and longer time-series glacier monitoring and lead to new findings. However, the premise is that we can produce accurate DEMs ourselves. Since we can use the powerful open-source ASP tool, we think this is not difficult, and the main workload is fine-tuning the DEM generation parameters with expert experience. Actually, I think the parameter fine-tuning may be worthwhile considering our future studies.

xinluo2018 commented 2 years ago

Thank you for your kind suggestions. The ASTER stereo images used for DEM generation have been projected to UTM in our workflow. The ip-filter-using-dem option is used for simple outlier removal in our experiments, and I found it works; if we do not set this parameter, the resulting DEM has many outliers, that is, elevation values below 0 or above 9000. We will pay attention to the vertical offset you mentioned. The corr-kernel setting of 15 15 seems more time-consuming, so, referring to https://github.com/FannyBrun/ASTER_DEM_from_L1A/blob/master/stereo.default.MikeWillisInt, we will change it to 25 25 or 21 21 by default and set subpixel-kernel to 35 35. Then we will apply the fully automatic workflow to the West Kunlun mountain region, about 1000 ASTER stereo images, for DEM generation during 2000-2020.
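Pulled together, the settings discussed in this exchange would look roughly like this in a stereo.default file (values are only the ones mentioned above, offered as a starting point to tune, not a recommendation):

```
alignment-method none
corr-kernel 21 21
subpixel-kernel 35 35
ip-filter-using-dem "srtm_utm_tmp.tif 100"
```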

rhugonnet commented 2 years ago

@xinluo2018 My take on this: for nearly two decades, a large body of research in glaciology and other fields has aimed to improve ASTER DEMs, using for instance ASP and MicMac, and to improve the related bias corrections. If you don't intend to develop new advanced methods (for example, if you just switch to ASP for stereo calculations with the MikeWillis settings in Fanny Brun's repository), I am afraid you won't produce DEMs of better quality than those with the specific bias corrections developed in Girod et al. (2017) and available worldwide over glaciers in Hugonnet et al. (2021). We tested many photogrammetric software packages and products over many years, and developed specific processing chains, including one with ASP and the MikeWillisInt parameters (Brun et al. (2017), Dussaillant et al. (2019), Menounos et al. (2019)) and one with MicMac and many bias-correction methods, before generating DEMs from 30 TB of data.

I would advise contacting Fanny Brun or David Shean for their DEMs if you prefer ASP, or me for MicMac. There are a few cloud-filtering issues in the Kunlun mountains, but those artefacts are mostly removed by the hypsometric filtering and gap-filling.

xinluo2018 commented 2 years ago

Hi @rhugonnet, I am glad to see your kind reply here. Do you remember that I contacted you before? I wanted to reproduce your study published in Nature using the code at https://github.com/rhugonnet/ww_tvol_study. I tried but ultimately failed, so I began to try a new tool to produce DEMs, and very luckily, I finally succeeded. Since I have completed all the workflow scripts using the ASP tool, I prefer to continue my work with ASP for now. Actually, I have downloaded your released data for the West Kunlun region; however, it seems you only released the elevation change rate data, not the time-series DEM data. If available, could you send me the time-series DEM data for this region? I will consider taking your DEMs as a reference or as cross validation for our generated DEMs; do you think that is necessary? I just want to develop a method that can produce acceptable time-series DEM data, which will benefit my future studies. Lastly, thank you for your kind advice again; I look forward to more communication with you.

xinluo2018 commented 2 years ago

@adehecq thank you for your kind advice, i just want to have a try to produce dem data by using the accessible aster dem.

rhugonnet commented 2 years ago

@xinluo2018 To be brutally honest, given what is available from the above-mentioned studies, I don't see any value whatsoever in re-generating the same ASTER DEMs, in the same place, with the same tools. There is only added value if further improvements are made to those tools, and that does not seem to be the focus here. The data of Brun et al. (2017), Shean et al. (2019), and Hugonnet et al. (2021), which all cover your study region, are available upon request owing to their large size (GitHub is not the channel to ask for data, however). I don't have any more comments on this, sorry!

xinluo2018 commented 2 years ago

No need to say sorry, @rhugonnet. Feel free to give comments or not; no one requires that responses be given. Thanks anyway.

dshean commented 2 years ago

FYI, here is what @pahbs used to process ASTER for HMA back in ~2019: https://github.com/pahbs/aster_dem. The options for ASP have evolved since then, but you can see the general config and approach. The resulting DEMs were co-registered and filtered using the methods outlined in the paper and tools here: https://github.com/dshean/hma_mb_paper.

dshean commented 2 years ago

I agree with @rhugonnet about leveraging past efforts and datasets whenever possible. But I can also appreciate the value of the training/education that comes with learning the tools, and the value of developing reproducible, end-to-end workflows using lower-level data products as inputs.

It seems unlikely that reprocessing large portions of the ASTER archive with ASP will offer significantly improved DEM products. The MMASTER corrections are valuable, and this functionality doesn't currently exist in ASP. We've spent a lot of time thinking about jitter corrections and there are some approaches implemented in ASP, but no robust solution for ASTER at present (would be a nice contribution if you want to take this on!). And then there is the nontrivial task of filtering, co-registering, and combining the DEMs to extract the signals of interest - for noisy ASTER DEMs, it will be challenging to improve upon the methodology outlined in Hugonnet et al.

Personally, I believe there is always value in attempting to reproduce and improve upon past methods and results. For me, it's often a question of prioritization, as there are too many important things to work on and never enough time :) Good luck @xinluo2018, whatever you decide to do.

xinluo2018 commented 2 years ago

The processing scripts and data you shared are helpful. Thanks for sharing. @dshean

xinluo2018 commented 1 year ago

Thank you for your response. The ASTER data provides a cloud cover percentage in the metadata; however, it is not accurate. I use ASTER for monitoring a glacier region, and I think the inaccuracy may be because glaciers look very similar to clouds (both are very bright) in my region.

I will do more tests to find out how cloudy images affect DEM quality; some pre-processing and post-processing may be required to improve DEM generation with the ASP tool...

Thank you.

Best regards Xin Luo
