Open jdduh opened 8 years ago
The only way I can think of to do this is to pick a maximum number of starting polygons that we can handle and either warn or refuse to let the user proceed if they exceed this number.
Have you had a chance to ponder this? I believe this is the last outstanding issue for the BAGIS add-ins at this time.
This is a related issue - BAGIS-H crashes ArcGIS when generating HRU layers that contain too many HRU polygons. I'm not sure if there is a way to catch this error before it crashes ArcGIS. Try to generate an HRU layer using the aspect template rule, disable the filtering process, and uncheck the "allow non-contiguous HRU."
I was able to eliminate an HRU layer that contains many polygon parts if I allow the output HRUs to be non-contiguous. However, ArcGIS hung if I unchecked Allow Non-contiguous HRU. What's the difference between the code paths for allowing and not allowing non-contiguous HRUs?
Do you think this is related? The process hung ArcGIS.
I think that last problem with the -10527 zones removed has to do with the fact that the number of zones is so small initially (5). I would like to have access to this AOI so I can try to recreate it.
ftp://basins.geog.pdx.edu/BAGIS/BAGIS_aois/Boise_R_nr_Twin_Springs_11022016.zip
One problem appears to be that the FeatureToRaster GP tool doesn't build the attribute table if there are a large number of rows. I don't know what the limit is. My guess is ESRI has something built into the GP tool. Maybe you want to check with them? This messes us up in later processing when we try to access the attribute table.
I'm now running the BuildRasterAttributeTable tool after the FeatureToRaster tool finishes. I set the overwrite flag to false so hopefully we don't take too big a performance hit for rasters that already have attribute tables. The other option is to check for the existence of the attribute table prior to building but I think this would be slower because we have to open the raster every time. And most times, we don't need to rebuild the table.
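The decision described above (build the table only when needed, with the overwrite flag off) can be sketched in plain Python. This is a minimal illustration of the logic only; the names `ensure_attribute_table` and the dict-based raster stand-in are hypothetical, and in the add-in the actual work would be done by the BuildRasterAttributeTable GP tool after FeatureToRaster finishes.

```python
def ensure_attribute_table(raster, build_table, overwrite=False):
    """Run build_table(raster) only when needed.

    raster: stand-in for a raster dataset (here a dict with a
            'has_vat' flag indicating an existing attribute table).
    build_table: callable that builds the attribute table
            (in BAGIS-H this would be the BuildRasterAttributeTable tool).
    overwrite: when False, skip rasters that already have a table,
            avoiding the performance hit of rebuilding every time.
    Returns True if the table was (re)built, False if skipped.
    """
    if overwrite or not raster.get("has_vat", False):
        build_table(raster)
        raster["has_vat"] = True
        return True
    return False
```

With overwrite left False, a raster that already has an attribute table is skipped entirely, which matches the intent of not taking a performance hit on rasters that are already fine.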
We only see this when disabling non-contiguous because we only run FeatureToRaster when splitting out the non-contiguous HRUs.
Also note that the retain source attributes checkbox adds a LOT of processing time if there are a large number of polygons. If NWCC doesn't need these source attributes, they may want to uncheck this for large data sets. I am still waiting for my test HRU to complete this function.
My test HRU never finished processing on Friday. I recommend picking a threshold for the maximum number of polygons. If the number of polygons exceeds this threshold, we warn the user that we will NOT retain the source attributes. Retaining the source attributes involves putting the zonal statistics table into memory and I think this is too much. I have no idea what the threshold should be. My test had about 230,000 HRUs and that was too much.
I do think the Eliminate screenshot from BoiseRiver is related. First a question though. In this case we are going from allowing non-contiguous HRUs (5) to disallowing them (10,532). The equation for showing the number of zones removed subtracts the child's zone count from the parent's. In this case, the answer is invalid because we are changing how we number the HRUs. If we want to allow this use case, what calculation should we print in the No. of zones removed textbox?
This example also hung when trying to retain the source attributes so apparently 10,532 polygons are too many to copy the source attributes.
We will add a warning when the process starts if the number of polygons exceeds 5,000. This warning will say that retaining the source attributes on a large number of polygons will cause ArcMap to become non-responsive. The user can choose to continue, knowing that the source attributes won't be copied over. This means they won't be able to trace the attributes generated by the rules back to their original values. If they don't renumber the HRUs, they can look at the original layer for the attributes.
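The warn-and-skip behavior could look something like the following sketch. The function name and message wording are hypothetical; only the 5,000-polygon threshold comes from this thread.

```python
MAX_SOURCE_ATTRIBUTE_POLYGONS = 5000  # threshold proposed in this thread

def check_retain_source_attributes(polygon_count, retain_requested):
    """Return (retain, warning).

    If the user asked to retain source attributes but the layer
    exceeds the threshold, drop the retain step and return a warning
    message instead of letting ArcMap hang on the in-memory
    zonal statistics table.
    """
    if retain_requested and polygon_count > MAX_SOURCE_ATTRIBUTE_POLYGONS:
        warning = (
            f"{polygon_count} polygons exceed the "
            f"{MAX_SOURCE_ATTRIBUTE_POLYGONS}-polygon limit; "
            "source attributes will NOT be retained."
        )
        return False, warning
    return retain_requested, None
```

For the 230,000-HRU test case mentioned earlier, this would fire the warning and continue without copying source attributes rather than hanging.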
Also, @jdduh's question from 14 Nov is still outstanding about the bad math in the No. of zones removed textbox when going from non-contiguous to contiguous HRUs.
Approach #1: Just say "Unknown" in the textbox when the user selects "Area of individual polygon parts."
Approach #2: Change "Zones Removed" to "Polygons Removed" and report the actual number of polygons to be removed. See this post (https://geonet.esri.com/thread/33488) on how to count the number of polygons in multipart polygons.
I don't think this number is important for the user to know, so I prefer Approach #1.
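For reference, the counting idea behind Approach #2 reduces to summing per-feature part counts (in ArcGIS each multipart polygon reports how many parts it contains). A minimal sketch, assuming we already have those per-feature counts in hand; the function names here are hypothetical:

```python
def total_polygon_parts(part_counts):
    """part_counts: iterable of per-feature part counts, e.g. gathered
    by iterating the layer's features and reading each geometry's
    part count. Returns the total number of individual polygons
    across all (possibly multipart) features."""
    return sum(part_counts)

def polygons_removed(parent_part_counts, child_part_counts):
    """Approach #2: the actual number of polygons removed is the
    difference in total part counts between the parent and child layers."""
    return total_polygon_parts(parent_part_counts) - total_polygon_parts(child_part_counts)
```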
Implemented warning if # of polygons > 5000 that source attributes cannot be retained. Getting the polygon count happens late in the eliminate process so it made more sense to just finish the process and warn the user rather than to abort the process.
Print "Unknown" in Number of zones removed textbox if the number is negative (going from non-contig to contig) or if user selects "Area of individual polygon parts". I posted a new version of the add-in to release 1.6.4 so it is ready for testing.
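The display rule just described can be summarized in a short sketch (the function name is hypothetical; the logic follows the two conditions stated above):

```python
def zones_removed_text(parent_zones, child_zones, by_individual_parts):
    """Text for the 'No. of zones removed' textbox.

    Returns 'Unknown' when the user selected 'Area of individual
    polygon parts', or when the difference is negative, which happens
    when going from non-contiguous to contiguous HRUs and the zones
    are renumbered (e.g. 5 parent zones -> 10,532 child zones).
    """
    removed = parent_zones - child_zones
    if by_individual_parts or removed < 0:
        return "Unknown"
    return str(removed)
```

The Boise River case (5 parent zones, 10,532 child zones) yields -10,527, so the textbox would show "Unknown" instead of a misleading negative count.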
When the input HRU layer has too many polygons (say, more than 10,000), the eliminate tool can take a really, really long time to complete (nobody has the patience to wait for the final results, so I don't know if the process ever actually finished). I believe the problem is related to the creation of the large vector polygon file. I will do further analysis to see if there is a workaround, but we need some routines to keep ArcMap/BAGIS-H from becoming permanently non-responsive.