SpaceGroupUCL / qgisSpaceSyntaxToolkit

Space Syntax Toolkit for QGIS
GNU General Public License v3.0

Reconstructing model from depthmapxnet temporary files #208

Open joaoponceleao opened 7 months ago

joaoponceleao commented 7 months ago

Hi,

We just completed a segment analysis of NYC that took a long time. This was done with source files on PostGIS and target destination on PostGIS. It produced no errors. However, it produced an empty table in PostGIS (with the correct fields), and a temporary layer in QGIS (with the correct fields but only 6 features).

The temporary files generated by depthmapxnet (analyse-result.txt, etc.) were correctly generated with all the necessary data. I can convert these into a segment map through the ref-id. I can also calculate the final post-processing steps (which, from my understanding, only produce NAIN and NACH?) in postgres or perhaps the field calculator. I've done this before using one of the equations for this, but it would be good to know which ones the Space Syntax Toolkit uses.

My question is: is this an acceptable method to reconstitute the final layer, or is there something else I am missing? @jorgegil, any thoughts? Please help.

I will test later with smaller files to see what could be producing this issue (essentially, it looks like everything worked but the results were not inserted into either temporary layer or postgis).

joaoponceleao commented 7 months ago

With the segment length as a weight, I was able to reconstitute the model using the analysis_result.txt id key and the source layer seg_id. I realise there are other ways to do this; I've also noticed that the x1,y1 and x2,y2 coordinates in the result txt file can be used for reconstruction. Having used this toolkit quite a bit in the past, I've noticed these issues only seem to occur during long analyses (i.e. 24h+) and are not necessarily related to the size of the network.
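A minimal sketch of the id-key reconstruction described above, assuming the results file is comma-separated with a header row; the column names (`Ref`, `T1024_Choice`, `T1024_Total_Depth`) are assumptions for illustration, not necessarily what depthmapxnet emits:

```python
import csv
import io

def join_results(results_csv, segments, key="Ref"):
    """Join depthmapxnet result rows onto source segments by id.

    results_csv: text of the analysis result file (header row assumed)
    segments: dict mapping seg_id -> feature attribute dict
    key: name of the id column in the results file (assumed 'Ref')
    """
    reader = csv.DictReader(io.StringIO(results_csv))
    for row in reader:
        seg_id = int(row[key])
        if seg_id in segments:
            # copy every analysis column onto the matching segment
            segments[seg_id].update(
                {k: v for k, v in row.items() if k != key}
            )
    return segments

# tiny illustrative example (values are made up)
results = "Ref,T1024_Choice,T1024_Total_Depth\n1,120,340\n2,15,410\n"
segments = {1: {"length": 12.5}, 2: {"length": 8.0}}
joined = join_results(results, segments)
print(joined[1]["T1024_Choice"])  # prints 120
```

The same join could of course be done directly in PostGIS or with a QGIS table join; the point is only that the id key is enough to rebuild the attribute table.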

For consistency, I will use the depthmap formula to calculate the remaining measures, as per #128.
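For reference, the normalisation formulas commonly attributed to depthmapX are NACH = log(CH + 1) / log(TD + 3) and NAIN = (NC + 2)^1.2 / (TD + 2); whether these match exactly what the toolkit or #128 uses should be verified against that issue. A sketch:

```python
import math

def nach(choice, total_depth):
    # Normalised Angular Choice: log(CH + 1) / log(TD + 3)
    return math.log(choice + 1) / math.log(total_depth + 3)

def nain(node_count, total_depth):
    # Normalised Angular Integration: (NC + 2)^1.2 / (TD + 2)
    return (node_count + 2) ** 1.2 / (total_depth + 2)

# illustrative values only
print(nach(120, 340))
print(nain(50, 340))
```

Both are plain per-row expressions over columns in the results file, so they translate directly to the QGIS field calculator or a PostGIS `UPDATE`.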

I'll leave this issue open in case any of the devs wants to respond. Feel free to close.

jorgegil commented 7 months ago

@joaoponceleao Thanks for pointing out this issue. I have never used the toolkit for such long processing tasks, so this is a new issue indeed. We would have to test.

The post-processing step is there for convenience, automating several tasks that one has (or had) to do in depthmapX after the analysis, and indeed they can be easily reproduced in QGIS or PostGIS as they are all based on the columns in the results file.

joaoponceleao commented 6 months ago

Thanks for the input @jorgegil. I am leaving the issue open since, as far as I can tell, if the analysis completed without errors and the analysis_result.txt file was generated, there should be no reason for the final tables (memory and postgis) to be empty, with correct columns but no values inserted. Something is going on here, and it seems to only affect long-processing (or r=n) tasks.

I did notice several errors in the source input layer after the fact, though; these are due to there being no geometry fix / cleaning step after running the network segmenter tool. I will double-check that the failure to populate the table is reproducible with valid geometry, and post a separate issue about the network segmenter. It should be quite simple to add a clean-up step to it.