NeoGeographyToolkit / StereoPipeline

The NASA Ames Stereo Pipeline is a suite of automated geodesy & stereogrammetry tools designed for processing planetary imagery captured from orbiting and landed robotic explorers on other planets.
Apache License 2.0

Program is interrupted at stage 1 #360

Closed zhaomumu233 closed 2 years ago

zhaomumu233 commented 2 years ago

I used a pair of large remote-sensing images to generate a DEM. The ASP version is 3.0.1, the "too many open files" error has been fixed, and the dense matching method is SGM.

At stage 1, the program is interrupted and reports the error below. What might cause this error, and how can I solve it? Thanks!

[2022-Feb-21 01:46:23] : CORRELATION FINISHED

    Traceback (most recent call last):
      File "/home/amax/anaconda3/envs/asp301/bin/parallel_stereo", line 891
        spawn_to_nodes(step, settings, self_args)
      File "/home/amax/anaconda3/envs/asp301/bin/parallel_stereo", line 491, in spawn_to_nodes
        generic_run(cmd, opt.verbose)
      File "/home/amax/anaconda3/envs/asp301/libexec/stereo_utils.py", line 96, in generic_run
        raise Exception('Failed to run: ' + cmd_str)
    Exception: Failed to run: parallel --will-cite --env ASP_DEPS_DIR --env PATH --env LD_LIBRARY_PATH --env PYTHONHOME -u -P 4 -a /ssdnvme/block-image/tmp59dk3l80 "/home/amax/anaconda3/envs/asp301/bin/python /home/amax/anaconda3/envs/asp301/bin/parallel_stereo block-fwd.tif block-nad.tif try-1-sgm/1 --threads-singleprocess 8 --job-size-h 2048 --job-size-w 2048 -t rpc -s SGM-stereo.default --skip-low-res-disparity-comp --processes 4 --threads-multiprocess 8 --entry-point 1 --stop-point 2 --work-dir /ssdnvme/block-image --tile-id {}"

oleg-alexandrov commented 2 years ago

You are using the conda build, which is already six months old. Maybe you can try the latest build. See the instructions here: https://github.com/NeoGeographyToolkit/StereoPipeline/releases

It is also not clear what the error is. Are there more error messages? Does it work with a different dataset? Can you try running plain stereo (not parallel_stereo), and without -s SGM-stereo.default? Maybe it will print a fuller error message.


zhaomumu233 commented 2 years ago

My ASP version is a newer one, installed in February 2022.

My two satellite images overlap over an area of about 150 km east-west and about 60 km north-south. The geometric model of the satellite images is the RPC model.

Compared with traditional satellite images, the data volume is larger, which poses new challenges for 3D reconstruction from satellite imagery.

These days I have done further experiments with my images. The program can successfully generate a DEM when I use a smaller image extent,

but when the image extent is larger, the program often fails. I currently believe this may be due to poor accuracy of the RPC model or poor alignment of the epipolar images.

Based on some ideas I have, I would like to ask about a few ASP features, and I hope to get your answers. My questions are below.

1. Do the epipolar images generated at the end of the Preprocessing stage get new RPC parameters for use in the subsequent triangulation stage?

  2. About the use of the bundle_adjust tool

I used the asp_bm and asp_sgm algorithms to generate DEMs and found that the edge area of the DEM is warped, like the "taco shape" described in Chapter 9 (Bundle Adjustment) of the ASP book. So I think I need to use the bundle_adjust tool to optimize the RPC parameters.

In a previous experiment I used the stereo_gui tool to visualize A.tif, B.tif, and reference.tif,

manually selected GCPs on the three images, and saved them for subsequent bundle adjustment.

However, manual selection is time-consuming, and the number of manually selected points is too small for such a large satellite image area.

I have also used the ipfind and ipmatch tools to get a .match file between an experimental image and reference.tif, but the ipmatch tool cannot get interest points common to all three images.

I would like to ask whether it is possible to use the ipfind and ipmatch tools to match the two experimental images to reference.tif separately,

open the two resulting .match files with stereo_gui to generate .gcp files, and run the bundle_adjust tool twice.

Finally, use --bundle-adjust-prefix in parallel_stereo to pass in the two camera adjustments.

  3. Cannot get enough matches to align images

When I use the whole image, the program often breaks, so I want to try using partial images and finally use the pc_merge tool to stitch the partial point clouds into a complete one.

I use stereo_gui to visualize the two satellite images, hold Ctrl and use the left mouse button to select the same area in both, and then run parallel_stereo.

A large number of interest points are extracted in the preprocessing stage, but no matching points can be obtained.

But when I used the ipfind and ipmatch tools to process L-cropped.tif and R-cropped.tif, I got a lot of matching points.

I would like to ask if the matching points were eliminated because of the poor accuracy of my RPC model.

The error output from the terminal is as follows:

    Matching forward ---> Obtained 14119 matches
    Matching backward ---> Obtained 16405 matches
    Matched 0 points
    VW Error: unable to match left and right images.

  4. Meaning and use of the corr-search parameter

Due to the larger coverage of this satellite imagery, image alignment may be less effective than for traditional satellite imagery, which may also lead to erroneous results.

I opened L-cropped.tif and R-cropped.tif in ENVI, looked for the same points, and compared their pixel coordinates.

I found that the same point does not lie on the same line in the two epipolar images, and the difference is large.

Therefore I want to modify the corr-search parameter to improve the generated DEM.

When I was reading the latest ASP book, the second paragraph of Section 8.2.2 gave the definition of the four numbers for corr-search. But "horizontal maximum boundary" is defined twice, which I think is a documentation bug.

Taking "corr-search -80 -2 20 2" as an example: if I find that the vertical disparity of the same point in the two epipolar images is not well eliminated,

with an error of about 20 pixels, I expand 20 by 50% to get 30. The four parameters should then be modified to "-80 -30 20 30". I wonder if I understand this correctly.

My understanding is that "-80 -2 20 2" defines a rectangular search area, and the program searches that area for the corresponding pixel.

Assuming that the alignment quality of the images is good most of the time, the vertical parallax has been largely eliminated, so the maximum vertical disparity is set to a small number like 2 or 4. But why is the horizontal disparity range not symmetric like the vertical one?

  5. Scope of use and precautions for local_epipolar

I noticed that the latest ASP adds a new image-alignment method, "local_epipolar". I guess that if large images do not align well, splitting the image and realigning each piece would help a lot,

but I also noticed that "epipolar" only works with pinhole camera models, so I want to ask whether "local_epipolar" supports the RPC model.

I also tried the "local_epipolar" method; judging from the generated GoodPixelMap.tif, the result is poor, with a large area of red.

The above are my questions; the description may not be very clear in some places, so please bear with me. Looking forward to your answers. Thank you very much!

oleg-alexandrov commented 2 years ago

1. Do the epipolar images generated at the end of the Preprocessing stage get new RPC parameters for use in the subsequent triangulation stage?

The RPC parameters are not changed. But the "epipolar" aligned images are used from the preprocessing stage all the way to the triangulation stage. Until that stage it is only important to match the pixels, not the exact cameras. Only at the triangulation stage, after all the pixels are matched between the left and right images, does the software go back to the original camera pixels, before "epipolar" alignment, and then use your cameras to triangulate.
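This round trip can be sketched in a toy form. The affine transforms below are made-up stand-ins for the epipolar alignment that ASP computes internally; the point is only that matching happens in aligned space, while the unmodified cameras see pixels mapped back through the inverse alignment:

```python
# Toy affine transforms (a, b, c, d, tx, ty) meaning
# x' = a*x + b*y + tx,  y' = c*x + d*y + ty.  These stand in for the
# epipolar-alignment transforms ASP computes; real values are dataset-specific.
def apply(T, x, y):
    a, b, c, d, tx, ty = T
    return a * x + b * y + tx, c * x + d * y + ty

def invert(T):
    # Inverse of a 2D affine map: invert the 2x2 part, then the translation.
    a, b, c, d, tx, ty = T
    det = a * d - b * c
    ia, ib, ic, id_ = d / det, -b / det, -c / det, a / det
    return ia, ib, ic, id_, -(ia * tx + ib * ty), -(ic * tx + id_ * ty)

align_left  = (1.0, 0.02, 0.0, 1.0, -5.0, 3.0)
align_right = (1.0, -0.01, 0.0, 1.0, 4.0, -2.0)

# A dense match found in aligned space: left pixel plus disparity (dx, dy).
lx, ly = 100.0, 200.0
dx, dy = 12.5, 0.3
rx, ry = lx + dx, ly + dy

# Only at triangulation time are pixels mapped back to the original images,
# where the unchanged RPC cameras are applied.
left_orig  = apply(invert(align_left), lx, ly)
right_orig = apply(invert(align_right), rx, ry)
```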

  2. About the use of the bundle_adjust tool

I used the asp_bm and asp_sgm algorithms to generate DEMs and found that the edge area of the DEM is warped, like the "taco shape" described in Chapter 9 (Bundle Adjustment) of the ASP book. So I think I need to use the bundle_adjust tool to optimize the RPC parameters.

In a previous experiment I used the stereo_gui tool to visualize A.tif, B.tif, and reference.tif,

manually selected GCPs on the three images, and saved them for subsequent bundle adjustment.

However, manual selection is time-consuming, and the number of manually selected points is too small for such a large satellite image area.

I have also used the ipfind and ipmatch tools to get a .match file between an experimental image and reference.tif, but the ipmatch tool cannot get interest points common to all three images.

I would like to ask whether it is possible to use the ipfind and ipmatch tools to match the two experimental images to reference.tif separately,

open the two resulting .match files with stereo_gui to generate .gcp files, and run the bundle_adjust tool twice.

Finally, use --bundle-adjust-prefix in parallel_stereo to pass in the two camera adjustments.

There is no need to generate GCP files; that is indeed rather time-consuming. The bundle adjustment by itself should be enough. Then you can do alignment later, if you have a reference DEM. You can try:

bundle_adjust left.tif right.tif left.xml right.xml -o run_ba/run

stereo left.tif right.tif left.xml right.xml run_stereo/run \
  --bundle-adjust-prefix run_ba/run

This will bundle-adjust the cameras and use the result in stereo. It will create match files for you. It won't optimize the RPC parameters; rather, it will solve for a rotation + translation adjustment for each camera, which has a similar effect.

Here it is assumed that the cameras are in .xml files. Otherwise, if they are embedded in the images, one can run the commands without them.
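The "rotation + translation adjustment" idea can be illustrated with a toy sketch. This is not bundle_adjust's code: the real tool optimizes the full camera pose against reprojection errors, while the function and all numbers below are made up for illustration, showing only what kind of correction is applied instead of new RPC coefficients:

```python
import math

# Toy per-camera adjustment: rotate the camera position about the z axis,
# then translate it.  Illustrative only; bundle_adjust also adjusts the
# camera orientation and solves for these parameters automatically.
def adjust_camera_center(center, angle_z, translation):
    cx, cy, cz = center
    ca, sa = math.cos(angle_z), math.sin(angle_z)
    rx, ry = ca * cx - sa * cy, sa * cx + ca * cy
    return (rx + translation[0], ry + translation[1], cz + translation[2])

# Hypothetical camera center, rotated 90 degrees and lowered by 10 m.
adjusted = adjust_camera_center((1.0, 0.0, 700000.0), math.pi / 2, (0.0, 0.0, -10.0))
```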

You should check after this whether the edge of the DEM is still warped. If it is, let me know. Note that the DEM may be in the wrong place; that is expected, because bundle_adjust may move both cameras. For now it is important that the shape of the DEM is good, rather than where it is.

When you create a DEM, you can also compute the error image, such as point2dem --errorimage run_stereo/run-PC.tif.

I suggest examining the error image (you can use the "colormap" tool to make a color image out of it) and seeing if there is any tilt in it, such as errors in one area being much bigger than in another. Hopefully not, as that would mean bundle adjustment failed.

If you have a reference DEM, such as Copernicus 30 m DEM (which is freely available), you can compare your obtained DEM to that one. The pc_align tool can be used to align your DEM to a reference DEM.

  3. Cannot get enough matches to align images

When I use the whole image, the program often breaks, so I want to try using partial images,

That should not be the case. I wonder if your images are similar enough in illumination. Otherwise stereo may not work.

If it still breaks, you can tell me the error message. I think I pushed a fix to stereo failing with the "too many open files" error. If you run into that one, you may want to get the latest build.

and finally use the pc_merge tool to stitch the partial point clouds into a complete one.

I use stereo_gui to visualize the two satellite images, hold Ctrl and use the left mouse button to select the same area in both, and then run parallel_stereo.

A large number of interest points are extracted in the preprocessing stage, but no matching points can be obtained.

But when I used the ipfind and ipmatch tools to process L-cropped.tif and R-cropped.tif, I got a lot of matching points.

I would like to ask if the matching points were eliminated because of the poor accuracy of my RPC model.

The error output from the terminal is as follows:

    Matching forward ---> Obtained 14119 matches
    Matching backward ---> Obtained 16405 matches
    Matched 0 points
    VW Error: unable to match left and right images.

That is interesting. Maybe you have more luck after you use bundle adjustment, as described earlier (without manual GCP or ipfind or ipmatch).

I hope you can send the full log here; I want to understand what happens before and after the text above. Maybe your RPCs are indeed not accurate. Not sure.

  4. Meaning and use of the corr-search parameter

Due to the larger coverage of this satellite imagery, image alignment may be less effective than for traditional satellite imagery, which may also lead to erroneous results.

I opened L-cropped.tif and R-cropped.tif in ENVI, looked for the same points, and compared their pixel coordinates.

I found that the same point does not lie on the same line in the two epipolar images, and the difference is large.

Therefore I want to modify the corr-search parameter to improve the generated DEM.

When I was reading the latest ASP book, the second paragraph of Section 8.2.2 gave the definition of the four numbers for corr-search. But "horizontal maximum boundary" is defined twice, which I think is a documentation bug.

Taking "corr-search -80 -2 20 2" as an example: if I find that the vertical disparity of the same point in the two epipolar images is not well eliminated,

with an error of about 20 pixels, I expand 20 by 50% to get 30. The four parameters should then be modified to "-80 -30 20 30". I wonder if I understand this correctly.

My understanding is that "-80 -2 20 2" defines a rectangular search area, and the program searches that area for the corresponding pixel.

Assuming that the alignment quality of the images is good most of the time, the vertical parallax has been largely eliminated, so the maximum vertical disparity is set to a small number like 2 or 4.

The L-cropped.tif and R-cropped.tif files are from before alignment, just cropped. You should open L.tif and R.tif and compare those. You can also open them in stereo_gui side by side; if you click on a pixel, its coordinates will be printed on the screen.

Yes, corr-search is a rectangular search window, of the form xmin, ymin, xmax, ymax.

But why is the horizontal disparity range not symmetric like the vertical one?

After alignment, the horizontal disparity has most of the information, while the vertical disparity should have little information, since, hopefully the alignment worked and the features mostly have a horizontal difference.
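As a hedged sketch of what such a window means, the check below interprets the four corr-search numbers from the example in the question as disparity bounds. The actual search is done inside ASP's correlator, not by user code:

```python
# Interpreting "corr-search xmin ymin xmax ymax" as disparity bounds:
# a candidate match at (x + dx, y + dy) is considered only if
# xmin <= dx <= xmax and ymin <= dy <= ymax.
def in_search_window(dx, dy, window=(-80, -2, 20, 2)):
    xmin, ymin, xmax, ymax = window
    return xmin <= dx <= xmax and ymin <= dy <= ymax

# Horizontal disparity carries most of the signal; vertical stays small
# if alignment worked.
print(in_search_window(-50, 1))    # -> True  (typical post-alignment match)
print(in_search_window(-50, 10))   # -> False (vertical disparity too large)

# Widening the vertical range as discussed ("-80 -30 20 30"):
print(in_search_window(-50, 10, window=(-80, -30, 20, 30)))  # -> True
```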

  5. Scope of use and precautions for local_epipolar

I noticed that the latest ASP adds a new image-alignment method, "local_epipolar". I guess that if large images do not align well, splitting the image and realigning each piece would help a lot,

but I also noticed that "epipolar" only works with pinhole camera models, so I want to ask whether "local_epipolar" supports the RPC model.

I also tried the "local_epipolar" method; judging from the generated GoodPixelMap.tif, the result is poor, with a large area of red.

Normally local_epipolar should work. I used it myself recently and it created good results. But I don't know what your data is like.

It is not clear to me whether your images are difficult or whether your use of the tools is the issue. I don't know if you can share pictures of your attempts; that may help. It would be interesting to see a screenshot of stereo_gui with the left and right regions selected with Control-mouse, then the result of running parallel_stereo from the menu and point2dem, so the DEM itself. It would also be interesting to see the error image, obtained as above with point2dem --errorimage, and what you mean by a "warped DEM".

The above are my questions; the description may not be very clear in some places, so please bear with me. Looking forward to your answers. Thank you very much!

It takes a lot of effort to get comfortable with the tools, and using them with big images can be difficult. Your questions are very welcome.

I am not sure how clear my own answers are. You can try to see if anything helps. Then let me know how it goes and we'll see what to do next.

zhaomumu233 commented 2 years ago

First of all, thank you very much for your answers and guidance last time. Following your suggestions, I did some further experiments to answer your questions.

My current conjecture is that as the image width increases, the ability of the rational function model to fit the rigorous sensor model gradually decreases, which may lead to ASP processing errors and to DEMs of poor quality and accuracy.

I checked the L.tif and R.tif generated by ASP and extracted the second band of F.tif, the vertical-disparity band. I found the vertical disparity to be large and not well removed.

  1. Reply to the last question

I did bundle adjustment on the two images and output the error image. The red area has the largest triangulation error, about 1000 meters. It looks like bundle adjustment has failed. image

The warping of the DEM can be seen as terrain rising or sinking. I displayed my terrain in 3D, and the ground on the right side has risen up; but the real terrain is a plain, so there should be no big undulations. image

  2. Question about point feature matching

When the preprocessing stage is over, I use the GUI to load the .vwip and .match files to see the matching points. I found that although the image has evenly distributed and sufficiently many interest points, there are no matching points in the upper right corner of the image.

When I read the ASP documentation, I found the parameters "ip-triangulation-max-error" and "epipolar-threshold". Were the matching points in the upper right corner of the image deleted because the error is too large or the epipolar geometry is poor?

The images are similar enough in illumination. The figure below shows that in the areas with matching points, the points are sufficiently many and evenly distributed, which also suggests good image similarity. The image on the left shows the .match file loaded in the GUI, and the image on the right is an enlargement of the upper right corner of the image.

image image

  3. Weird GoodPixelMap.tif

When I use the local_epipolar alignment method and the dense matching algorithm is SGM, ASP sometimes fails to find interest points and then fails to match; the terminal log is as follows:

    Using algorithm: 1
    --> Reading global interest points.

But when choosing some small regions to run local_epipolar, sometimes the result is not very good, as can be seen from GoodPixelMap.tif: the bottom left part of each tile succeeds, but everything else fails. The situation is shown in the figure below. image

This GoodPixelMap.tif situation also sometimes happens when my alignment method is affineepipolar: the periphery of the image matches successfully, but the center area fails, as shown in the figure below.

image

  4. pc_align tool

I used the pc_align tool to align the DEM from relative bundle adjustment to an open-source DEM, but the result has three bands. Normally a DEM has one band storing elevation values; why are there three here? Below is the command I entered:

pc_align open-dem.tif sgm-DEM.tif \
  -o align/1 --alignment-method similarity-least-squares --save-transformed-source-points

  5. processes and threads

When processing images, we need to set the processes and threads values. But sometimes I'm not very clear about the difference between the two, so sometimes I just set the value of threads. Below is my understanding; please correct me if I am wrong.

I often use the SGM algorithm, so I set the corr-memory-limit-mb parameter. If my threads value is 10 and corr-memory-limit-mb is 4000, does that mean the maximum memory usage is 10 × 4000 MB = 40 GB? And if I set processes to 2, threads to 10, and corr-memory-limit-mb to 4000, is the maximum memory usage then 2 × 10 × 4000 MB = 80 GB?

When I use the SGM algorithm, threads is set to 24 and processes is not set. When I use the MGM algorithm, because it uses at most 8 threads, I set processes to 3 and threads to 8. Is the processing speed in these two cases roughly the same?

  6. too many open files

The ASP version I installed was released on March 8, 2022, but the "too many open files" problem still occurred, during the filtering phase. I noticed that this bug was fixed in the latest version, yet I am hitting it again.

      Error:GdalIO: getdem/1-49600_36800_1600_1600-RD.tif:  Too many open files (code = 4)

The above are my thoughts and experimental results in the recent period of time, and I hope to continue to receive your guidance. Thank you very much~

oleg-alexandrov commented 2 years ago

My current conjecture is that as the image width increases, the ability of the rational function model to fit the rigorous sensor model gradually decreases, which may lead to ASP processing errors and to DEMs of poor quality and accuracy.

That makes sense.

  2. Question about point feature matching

When the preprocessing stage is over, I use the GUI to load the .vwip and .match files to see the matching points. I found that although the image has evenly distributed and sufficiently many interest points, there are no matching points in the upper right corner of the image.

When I read the ASP documentation, I found the parameters "ip-triangulation-max-error" and "epipolar-threshold". Were the matching points in the upper right corner of the image deleted because the error is too large or the epipolar geometry is poor?

Could be. You can try running stereo with --skip-rough-homography and --no-datum and see if that brings back those missing points. With these options the camera information is not used, which in your case appears not so accurate.

  3. Weird GoodPixelMap.tif

  4. pc_align tool

I used the pc_align tool to align the DEM from relative bundle adjustment to an open-source DEM, but the result has three bands. Normally a DEM has one band storing elevation values; why are there three here?

The result of pc_align is a point cloud. If you want a DEM, you should use point2dem to create it. (https://stereopipeline.readthedocs.io/en/latest/tools/pc_align.html#output-point-clouds-and-convergence-history)

  5. processes and threads

When processing images, we need to set the processes and threads values. But sometimes I'm not very clear about the difference between the two, so sometimes I just set the value of threads. Below is my understanding; please correct me if I am wrong.

I often use the SGM algorithm, so I set the corr-memory-limit-mb parameter. If my threads value is 10 and corr-memory-limit-mb is 4000, does that mean the maximum memory usage is 10 × 4000 MB = 40 GB? And if I set processes to 2, threads to 10, and corr-memory-limit-mb to 4000, is the maximum memory usage then 2 × 10 × 4000 MB = 80 GB?

I think the memory usage does not depend on the number of threads, only on the number of processes. So, if you have 16 cores, it is better to use 2 processes with 8 threads each (8 threads is the most MGM supports).
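Under that assumption, the memory bound works out per process rather than per thread. The helper below is a made-up illustration of that arithmetic, not an ASP function; the 4000 MB figure is the corr-memory-limit-mb value from the question:

```python
# Rough upper bound on correlation memory, assuming (as stated above) that
# --corr-memory-limit-mb caps each process independently of its thread count.
def corr_memory_gb(processes, corr_memory_limit_mb=4000):
    return processes * corr_memory_limit_mb / 1000.0

print(corr_memory_gb(1))  # -> 4.0  (one process: 4 GB, regardless of threads)
print(corr_memory_gb(2))  # -> 8.0  (two processes: 8 GB, not 80 GB)
```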

  6. too many open files

The ASP version I installed was released on March 8, 2022, but the "too many open files" problem still occurred, during the filtering phase. I noticed that this bug was fixed in the latest version, yet I am hitting it again.

      Error:GdalIO: getdem/1-49600_36800_1600_1600-RD.tif:  Too many open files (code = 4)

The above are my thoughts and experimental results in the recent period of time, and I hope to continue to receive your guidance. Thank you very much~

The file above is a little tile. Your output directory should have an RD.tif file which is actually a text file that, inside, contains a list of such little tiles. I suggest opening it and taking a look. It should start like this:

<VRTDataset rasterXSize="688" rasterYSize="1204">

Now, I hope you can check whether it has, further down, lines like this:

<SourceProperties RasterXSize="688" RasterYSize="180">

This is what my fix put in. If you don't have such lines, that means your build is without the fix. If you have such text, maybe there's a bug somewhere which I'd have to figure out how to reproduce. Other users told me that after my fix the problem no longer shows up.
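A hedged sketch of that check: the RD.tif written for such runs is a GDAL VRT, which is plain text, so one can simply search it for SourceProperties elements. The sample contents below are made up for illustration; in practice you would read your own file, e.g. with open("run/run-RD.tif").read():

```python
# Check whether a VRT text blob contains the SourceProperties elements
# that the fix is expected to add.  Illustrative only.
def has_source_properties(vrt_text):
    t = vrt_text.lstrip()
    return t.startswith("<VRTDataset") and "<SourceProperties" in t

with_fix = '''<VRTDataset rasterXSize="688" rasterYSize="1204">
  <SourceProperties RasterXSize="688" RasterYSize="180"/>
</VRTDataset>'''

without_fix = '''<VRTDataset rasterXSize="688" rasterYSize="1204">
</VRTDataset>'''

print(has_source_properties(with_fix))     # -> True  (build has the fix)
print(has_source_properties(without_fix))  # -> False (build predates the fix)
```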

oleg-alexandrov commented 2 years ago

I made a sample example showing how to run our software with a dataset having RPC cameras. It is at https://github.com/NeoGeographyToolkit/StereoPipelineSolvedExamples/releases. Maybe experimenting with that one a little may offer some insights into what is going on with your own dataset.

oleg-alexandrov commented 2 years ago

I also suggest, when running point2dem, using the --errorimage option, and then colorizing and examining that image. Normally the errors should be no more than your image resolution in meters/pixel, or comparable. If they are big, or concentrated in one area, that means your RPC model is wrong. Not sure if bundle adjustment can help.

A reference DEM can be found here, https://portal.opentopography.org/raster?opentopoID=OTSDEM.032021.4326.3, which can also help understand if the issue is your data, or how you use the tools.

zhaomumu233 commented 2 years ago

1. Added --skip-rough-homography and --no-datum, hoping to improve results

In the preprocessing stage I added the --skip-rough-homography and --no-datum parameters, but the resulting matching points became worse: matching points exist only in some areas, as shown in the following figure. image

  2. too many open files

I opened the RD.tif file; the beginning of the file was the VRTDataset line, but I didn't find SourceProperties RasterXSize="***" RasterYSize="***". Yesterday I deleted the old version of ASP on computer A and installed version 20220308. After the installation completed, I packaged the new environment and migrated it to computer B. Then I ran it again on computer B; "too many open files" still appears, and there is still no "SourceProperties RasterXSize RasterYSize" in RD.tif. Does this happen because of the influence of the old version of ASP on computer B? I am confused as to why the new version cannot be installed correctly.

  3. IntersectionErr.tif

Without using the bundle_adjust tool, the IntersectionErr.tif generated with --errorimage is as you said: the error is large and concentrated in a certain area. The resolution of my DEM is about 3 m per pixel, but the error value sometimes reaches hundreds of meters. Therefore, RFM is not suitable for larger-format satellite imagery. I also compared with the public DEM; the elevation errors in some areas range from tens to hundreds of meters, which is obviously wrong. But leaving the elevation error aside, the planar position of my DEM does not deviate too much, and the terrain shape matches the public DEM. Is this because, in the dense matching stage, the SGM algorithm performed well and found most of the correct matching points?

The triangulation error is described in the ASP documentation as "the distance between two rays; it is not the true accuracy of the DEM. It is only another indirect measure of quality." I would like to know how ASP calculates the distance between two rays.

  4. ASTER example

I studied the example of how to run ASP with a dataset having RPC cameras. I think this kind of picture is simple, intuitive, and beautiful. May I ask what software is used to open the DEM? I also want to use such a picture to show my results.

![image](https://user-images.githubusercontent.com/67795058/159023087-3b0b21c1-853f-4230-9e6b-98caea9a9351.png)

  5. Creating my own camera model

Since the RFM model is no longer applicable to a large area, the original rigorous sensor model is a good choice. Is it possible to call my own camera model by modifying the code, for example with -t mymodel? It is difficult for me to understand and modify the source code at the moment, but I want to know whether the idea is possible.

Finally, I would like to express my sincere thanks again for your continued help~
oleg-alexandrov commented 2 years ago

Since ASP worked for you on the ASTER example I provided, and since your own dataset, as you say, has a large intersection error, that means that your camera model is indeed not accurate enough.

In the preprocessing stage I added the --skip-rough-homography and --no-datum parameters, but the resulting matching points became worse: matching points exist only in some areas, as shown in the following figure. image

I am running out of ideas. ASP really doesn't like something about your dataset. If you can share the data with me privately I could take a closer look (I know data sharing may not be possible for various reasons).

Yesterday I deleted the old version of ASP on computer A and installed version 20220308. After the installation completed, I packaged the new environment and migrated it to computer B. Then I ran it again on computer B; "too many open files" still appears, and there is still no "SourceProperties RasterXSize RasterYSize" in RD.tif.

Can you do parallel_stereo --version? When I run it on mine, I get:

NASA Ames Stereo Pipeline 3.0.1-alpha Build ID: c9f68ad7 Build date: 2022-03-10

Also, you can go to the directory where ASP is, and look up the file libexec/parallel_stereo (not bin/parallel_stereo). Open it, and search for this text:

https://github.com/NeoGeographyToolkit/StereoPipeline/blob/master/src/asp/Tools/parallel_stereo#L340

Does this happen because of the influence of the old version of ASP on computer B? I am confused as to why the new version cannot be installed correctly.

I don't know. I would not think so.

I also compared with the public DEM; the elevation errors in some areas range from tens to hundreds of meters, which is obviously wrong. But leaving the elevation error aside, the planar position of my DEM does not deviate too much, and the terrain shape matches the public DEM. Is this because, in the dense matching stage, the SGM algorithm performed well and found most of the correct matching points?

I guess that is good. It is still not good when that error is too high.

The triangulation error is described in the ASP documentation as "the distance between two rays"; it "is not the true accuracy of the DEM. It is only another indirect measure of quality." I would like to know how ASP calculates the distance between two rays.

Two rays are traced, and if they don't intersect, the closest segment between the rays is found. The intersection error is the length of that segment. If the intersection error is high, it means that your rays, which are meant to intersect at the ground point, are really off, which is a bad thing.
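Numerically, the idea can be sketched as follows. This is a simplified illustration (treating the rays as infinite lines, with made-up inputs), not ASP's actual implementation:

```python
import numpy as np

def ray_intersection_error(p1, d1, p2, d2):
    """Length of the shortest segment between the lines p1 + t*d1 and
    p2 + s*d2. A simplified illustration of the 'intersection error'
    idea; not ASP's actual code."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    r = p1 - p2
    b = d1 @ d2                 # cosine of the angle between the rays
    denom = 1.0 - b * b         # (d1.d1)(d2.d2) - (d1.d2)^2 for unit dirs
    if denom < 1e-12:           # nearly parallel rays
        return float(np.linalg.norm(r - (r @ d1) * d1))
    t1 = (b * (d2 @ r) - (d1 @ r)) / denom
    t2 = ((d2 @ r) - b * (d1 @ r)) / denom
    q1, q2 = p1 + t1 * d1, p2 + t2 * d2   # closest point on each ray
    return float(np.linalg.norm(q1 - q2))
```

For rays that truly intersect at the ground point the result is zero; camera model errors push the rays apart and make it grow.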

I worked through the example of how to run ASP with a dataset having RPC cameras. I think this kind of picture is simple, intuitive, and beautiful. Can I ask what software is used to open the DEM?

Try stereo_gui --hillshade run/run-DEM.tif. Also see https://stereopipeline.readthedocs.io/en/latest/next_steps.html#generating-color-hillshade-maps

  1. Create my own camera model

Since the RFM model is no longer applicable over a large area, the original rigorous sensor model is a good choice. Is it possible to call my own camera model by modifying the code, for example -t mymodel? It's difficult for me to understand and modify the source code at the moment, but I want to know if the idea is feasible.

Creating your own camera model is a giant pain. If your sensor is at least a frame camera, rather than linescan (also called pushbroom) it may be easier. But for linescan there's a good amount of code to write.

zhaomumu233 commented 2 years ago
  1. Personally, I really want to share data with you to overcome these doubts, but due to some regulations I can't do so. I'm very sorry; please forgive me.

  2. parallel_stereo --version

When I enter "parallel_stereo --version" in the terminal, the displayed information is as follows

NASA Ames Stereo Pipeline 3.0.0 Build ID: 2d554034

Built against: NASA Vision Workbench 3.0.0 Build ID: 90486189 USGS ISIS 5.0.1 Boost C++ Libraries 106800 GDAL 2.4.1 | 20190315

I found the installation path of ASP, "/home/admin/anaconda3/envs/aspnew3/libexec", and opened the libexec folder, but I cannot find parallel_stereo there.

  1. I also used colormap to create some beautiful images before, but my legends don't have labels like yours, they only have colors. Are these labels added by hand later? It would be better if the colormap tool could also add the map scale directly.

Finally, I wish you all the best!

oleg-alexandrov commented 2 years ago

I do know data sharing is impractical for anything except public datasets.

Your build is old, from Jul 27, 2021. You may want to set carefully the PATH variable to point to the new build or even wipe or hide the old one.

ASP's tools cannot create labels, map scales, etc. The images in the doc I shared with you were likely done with some regular viewer. One can use QGIS for that which is free.

The bundle_adjust tool can reduce the intersection errors, or at least make them better distributed. (I don't recall if you used it before.) See https://stereopipeline.readthedocs.io/en/latest/tools/bundle_adjust.html. Then stereo needs to be called with --bundle-adjust-prefix as specified there, and one can do point2dem --errorimage on the new PC cloud to see the intersection error.

Note that bundle_adjust creates its own match files, and you can examine if those are any better. Bundle adjustment also filters those matches for outliers. (Stereo later can also use the bundle_adjust match files if they are copied over to the stereo directory before starting.)
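Put together, the sequence just described might look like the following sketch (the image names match the ones from earlier in this thread; the ba/ and st/ output prefixes are placeholders, and -t rpc matches the session type already in use):

```shell
# 1. Bundle-adjust the cameras; writes adjusted cameras and match files
#    under the ba/run prefix.
bundle_adjust block-fwd.tif block-nad.tif -t rpc -o ba/run

# 2. Run stereo using the adjusted cameras.
parallel_stereo block-fwd.tif block-nad.tif st/run \
    -t rpc --bundle-adjust-prefix ba/run

# 3. Make the DEM together with the intersection-error image.
point2dem --errorimage st/run-PC.tif
```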

oleg-alexandrov commented 2 years ago

To add to the previous one:

I find the installation path of ASP "/home/admin/anaconda3/envs/aspnew3/libexec" and open libexec folder, but I can not find the parallel_stereo .

I see. You installed ASP with conda, so parallel_stereo should be in the bin directory. But in either case, you have the old build. The daily build is not shipped with conda. You will need to get the tarball from https://github.com/NeoGeographyToolkit/StereoPipeline/releases and extract it somewhere in your user dir and set the path to it.

zhaomumu233 commented 2 years ago
  1. I have used the bundle_adjust tool before. I then used --errorimage to generate an intersection-error map and compared it with the one generated without bundle_adjust. The error in some areas is indeed smaller, but the effect is very limited, and the improved area is very small. I also compared the two DEMs, and the improvement in accuracy is likewise small.

    For smaller areas of my satellite imagery, I also selected images with a width of about 70 km to generate a DEM. This time the triangulation error became smaller. Compared with the public DEM, the elevation is similar and the accuracy is better.

As for the case where matches could not be generated before, I think it is also due to the RFM accuracy problem. As I said before, the "ip-triangulation-max-error" parameter eliminated those matches, resulting in no matches in some areas.

Therefore, for satellite images with a large swath, I think the accuracy of my RFM model is too poor; maybe the first thing I should do is improve the precision of the RPC parameters, and then pass them into ASP to get the DEM.

  1. My ASP was installed using conda. I found parallel_stereo in my bin folder and opened it, but I didn't find "Bugfix for the "too many open files" problem." I think I understand what you mean: the daily build cannot be obtained through conda.

But I'm not quite sure about the update method you're talking about.

Maybe you mean to unzip "StereoPipeline-3.0.1-alpha-2022-03-18-x86_64-Linux.tar.bz2" and put it in the ASP installation folder, such as "/home/zhao/anaconda3/envs/asp", replacing the old files with the files from the new package?

You say "set the path to it". I don't know how to proceed with this, can you give me more detailed guidance?

Thank you~

oleg-alexandrov commented 2 years ago

You can unzip the ASP .tar.bz2 file anywhere in your home directory and then run /home/your/location/asp/bin/parallel_stereo --version to confirm the version, then you can continue using the full path to this and other programs instead of your other installation.
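A concrete sketch of that (the directory name below is taken from the tarball mentioned earlier in the thread; adjust it to wherever you actually extracted it):

```shell
# Placeholder path: wherever you extracted the daily-build tarball.
ASP_DIR="$HOME/StereoPipeline-3.0.1-alpha-2022-03-18-x86_64-Linux"

# Option 1: call tools by full path, e.g.:
#   "$ASP_DIR/bin/parallel_stereo" --version
# Option 2: put the new build first on the PATH for this shell session:
export PATH="$ASP_DIR/bin:$PATH"
```

With the second option, plain `parallel_stereo --version` should then report the new build rather than the old conda one.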

oleg-alexandrov commented 2 years ago

To add to the previous message, if your problem is indeed that the RPC model is only locally accurate, but results in warping if used globally, one may try to process things in tiles. In each tile, which could be maybe 5 km by 5 km, with 1 km overlap among the tiles, one could replace their RPC model with ours for that tile, with the tool cam2rpc. Then, the left and right image crops for that tile with their local rpc camera files could be bundle adjusted individually (the param --camera-weight can be increased if the resulting DEM goes too far), see if the intersection error at least for a tile goes down, then run stereo for that tile, get a DEM, and align that one to a third party global DEM using our pc_align tool. Then, if those look consistent (which can be verified with the geodiff tool), those could be merged with dem_mosaic.

This would be a very desperate measure though, which would take a lot of time, and I am not sure about the chances of success. The ideal approach is to use the exact model. We support that only for DigitalGlobe, PeruSat, and SPOT5, and as before, adding support for a new model is tricky.
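The tiling scheme suggested above (5 km tiles with 1 km overlap; both sizes are of course adjustable) can be sketched as follows. This only computes the tile extents; the per-tile cam2rpc / bundle_adjust / stereo / pc_align steps would still be run separately for each tile:

```python
def axis_tiles(length, tile, overlap):
    """1-D tile intervals of size `tile`, stepping by `tile - overlap`,
    clipped to [0, length]. A sketch; units are meters here."""
    step = tile - overlap
    starts = [0]
    while starts[-1] + tile < length:
        starts.append(starts[-1] + step)
    return [(s, min(s + tile, length)) for s in starts]

def make_tiles(width, height, tile=5000, overlap=1000):
    """(x0, y0, x1, y1) extents covering a width x height area with
    overlapping tiles, per the 5 km / 1 km suggestion above."""
    return [(x0, y0, x1, y1)
            for y0, y1 in axis_tiles(height, tile, overlap)
            for x0, x1 in axis_tiles(width, tile, overlap)]
```

Each resulting extent would get its own image crops, a locally fitted RPC model, and its own stereo run, with the overlap letting dem_mosaic blend the seams.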

zhaomumu233 commented 2 years ago

1. Thanks for your suggestion. Based on it, my current strategy is to divide a large satellite image into several sub-images and generate corresponding RPC parameters for each sub-image. The resulting sub-DEMs are then merged using the dem_mosaic tool.

According to this idea, I also did some experiments, and the generated results are all within the acceptable range. The intersection error and the elevation difference with the public DEM are much smaller than before.

2. I unzipped the ASP .tar.bz2 file in my home directory and then used the parallel_stereo in the bin folder directly. This time "too many open files" did not appear again, and the DEM was successfully generated. After many twists and turns, I have finally overcome this difficulty, and I am very happy.

Finally, please allow me to express my thanks again for your help all the time. Many of your suggestions have given me great inspiration and helped me solve many difficulties and doubts. Hope ASP gets better and better~

oleg-alexandrov commented 2 years ago

Glad there is progress, and I understand that things take time and it is hard to figure out the best way of handling issues.

Not sure you did the alignment step (with our pc_align) before comparing your little DEMs to the public one and before mosaicking, but if you say things are acceptable, then maybe it is ok without alignment.

zhaomumu233 commented 2 years ago

I used other tools to align the DEM with the public DEM. The following figure is the error distribution of the two DEMs; the ordinate is the number of pixels, and the abscissa is the elevation difference. It can be seen that the error is mainly concentrated around 16 meters. [image: elevation error map after alignment]

I also used the pc_align tool to align the DEMs, and only allowed a "translation" transformation. The computed transform matrix is the identity matrix. Can it be concluded that the horizontal accuracy is good?

Considering that my DEM resolution is 3 m and the public DEM resolution is 30 m, I don't know whether the error can be considered acceptable.

Thank you~

oleg-alexandrov commented 2 years ago

If the public DEM grid resolution is 30 m that does not mean the vertical resolution is 30 m, it is likely a few factors better than that and may depend on slope. You may want to consult that public DEM's info for more details.

The mostly 16 m vertical accuracy seems to be a little high to me. But I am not sure. Our 'geodiff' tool can be used to colorize the absolute difference, and the colormap and DEMs can be overlaid and hillshaded in the GUI to see where the errors are and if there are any obvious horizontal or vertical shifts or other blunders. Or, if you have more than one image set at 3 m resolution, one could try to create DEMs from those too and see how they all compare.
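That comparison might be sketched as follows (file names are placeholders; this assumes the ASP tools mentioned in this thread are on the PATH, and that geodiff writes its output under the given prefix with a -diff.tif suffix and colormap with a _CMAP.tif suffix):

```shell
# Absolute elevation difference between the two DEMs (placeholder names).
geodiff --absolute my_dem_3m.tif public_dem_30m.tif -o diff

# Colorize the difference image.
colormap diff-diff.tif

# Overlay the DEMs and the colorized difference with hillshading.
stereo_gui --hillshade my_dem_3m.tif public_dem_30m.tif diff-diff_CMAP.tif
```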

zhaomumu233 commented 2 years ago

The image coverage area I chose is mountainous, with an average elevation of 1500 meters. First, I used the geodiff tool to calculate the absolute elevation error between my DEM and the public DEM, limiting the error values to the range of 0-50 meters. The error distribution is shown in the attached figures. The error in the red area in the lower right corner of the image is large, which may be caused by the low fitting accuracy of the RPC in that area.

The following are some local detail images; I put the error map together with the public DEM to make an image swipe. It can be seen that the lines with larger error follow the valley lines. The upper half of the image is the elevation error map, and the lower half is the public DEM.

Considering that the test area is mountainous, and most of the lines with large errors are consistent with the valley lines, is this accuracy within an acceptable range?

Thank you very much~

oleg-alexandrov commented 2 years ago

To me this looks like there's a problem somewhere. The error is too big and it has a clear pattern.

Normally, if you have one low-resolution DEM and another high resolution DEM, and take their difference, the error is mostly small except at corners of steeper areas.

But it is hard to say for sure just by looking at pictures. And if there's a big error, it is hard to say if it comes from lack of alignment or from poor cameras.

Note that point2dem can be used to make a DEM at 30 m too, not just at your native 3 m resolution, by changing the --tr parameter. Then one could compare the datasets at the same resolution. Also, as before, if you have another image pair, another DEM can be created and used for comparison.
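For instance (the run/run prefix is a placeholder; note that --tr is in the units of the output projection, so 30 means meters only when a projected coordinate system is used, e.g. via --t_srs, and degrees otherwise):

```shell
# Re-grid the same triangulated point cloud at a 30 m posting.
point2dem --tr 30 run/run-PC.tif
```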

So I don't know what to say for sure.