Closed: DarthSysmon closed this issue 7 years ago.
Hi and thank you for your very thorough report. I will take a look at the issues ASAP.
There are no plans to make the file geodatabase the only choice for inputs, and we will work on that well before the final release; it is in our master plan.
The line 400 bug relates to the old code conversion, and I am working on it.
T
On Wed, Feb 15, 2017 at 4:59 AM, Simon Nielsen notifications@github.com wrote:
Tested the same setup as previously, but with file geodatabases as the workspace and temp storage. These were kept separate because the Calculate Response tool creates a lot of temp files.
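For reference, the workspace/scratch separation described above can be set up in arcpy roughly as follows; this is a minimal sketch assuming ArcGIS 10.x, and the geodatabase paths are hypothetical placeholders.

```python
import arcpy

# Hypothetical paths; keep the working and scratch geodatabases separate
# because Calculate Response writes many temporary datasets.
arcpy.env.workspace = r"C:\ArcSDM\work.gdb"
arcpy.env.scratchWorkspace = r"C:\ArcSDM\scratch.gdb"
arcpy.env.overwriteOutput = True
```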
I'm not sure I am keen on using file geodatabases for spatial data modelling. They are good for sorting input data, although I couldn't get the rasters organised nicely because I appeared unable to pull individual rasters out of the raster catalogues within the file geodatabase. And there is no way to organise tables, whereas I would prefer to be able to separate final binary weights tables from early tests, usually of the cumulative ascending or categorical kind. Rasters likewise need to be separated into inputs and final model outputs.
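As a possible workaround for pulling a single item out of a raster catalogue, the Raster Catalog To Raster Dataset tool can be limited to one row with a where clause. This is only a sketch: the catalogue path, output name, and OBJECTID below are hypothetical.

```python
import arcpy

catalog = r"C:\ArcSDM\work.gdb\evidence_catalog"   # hypothetical raster catalogue
out_raster = r"C:\ArcSDM\work.gdb\evidence_01"     # hypothetical standalone raster

# Restricting the mosaic to exactly one catalogue row effectively copies
# that item out as its own raster dataset.
arcpy.RasterCatalogToRasterDataset_management(catalog, out_raster,
                                              where_clause="OBJECTID = 1")
```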
I ran four rasters and let the CW toolbox point to the "working" file geodatabase, but otherwise let it pick its own names. It ran the weights tables with no issues, except, as I mentioned, that it is not possible to create subfolders for tables within a file geodatabase.
Running Calculate Response comes with a few issues. First, here too it is not possible to add weights tables directly from the workspace. Second, output rasters cannot be saved directly to a raster catalogue within the file geodatabase. Third, the script still hangs on line 400, because two of the output fields are set to DBF format while the script expects rasters from which to read standard deviations and the like. Finally, if the output fields are left as-is, the rasters (and DBF files) go to temp and won't work.
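To illustrate the line 400 failure mode: the response script needs raster outputs it can query for statistics, so a DBF-typed output breaks that lookup. The following is a hedged sketch, not the actual ArcSDM code; the path and messages are hypothetical.

```python
import arcpy

out_std = r"C:\ArcSDM\work.gdb\wofe_std"  # hypothetical output parameter value

desc = arcpy.Describe(out_std)
if desc.dataType == "RasterDataset":
    # Get Raster Properties can return a raster's standard deviation...
    std = float(arcpy.GetRasterProperties_management(out_std, "STD").getOutput(0))
    arcpy.AddMessage("Raster standard deviation: {}".format(std))
else:
    # ...but a .dbf table has no such property, which is where the script stalls.
    arcpy.AddError("Expected a raster output, got {}".format(desc.dataType))
```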
The tool manages to make a pprb raster, which looks sound. I will test the outputs of the old and new toolboxes tomorrow.
Cheers, Simon
Attachment: CalculateResponseCrash_FGDB.txt https://github.com/gtkfi/ArcSDM/files/775933/CalculateResponseCrash_FGDB.txt
Screenshot: calculateresponse_issues_fgdb https://cloud.githubusercontent.com/assets/25653028/22958991/b009fca8-f397-11e6-83e5-628cc1d536d7.PNG
These issues, except the ones that are explicitly not possible (like dragging a table to the tool), are fixed in .13. Reopen specific ones later if they reappear.