Wei-Liao opened this issue 3 years ago
Hello,
I don't know about your case, but if you are working with sparse polygons rather than contiguous parcels, you should use the building centroids (points) and a distance-based spatial weight matrix to obtain meaningful results. The plugin uses a contiguity-based matrix whenever the input layer is a polygon layer, which is not useful if the polygons are not connected in space.
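For example, with geopandas and libpysal the centroid/distance-band approach looks roughly like this (a minimal sketch; the file name, and the 500 m threshold, are placeholders to adapt to your data):

```python
import geopandas as gpd
from libpysal.weights import DistanceBand

# Hypothetical input file; the layer should be in a projected CRS so the
# distance threshold below is in metres.
buildings = gpd.read_file("buildings.shp")

# Reduce each polygon to its centroid so that disconnected buildings can
# still be related through distance rather than shared borders.
centroids = buildings.copy()
centroids.geometry = buildings.geometry.centroid

# Neighbours are all centroids within 500 m (the threshold is an
# assumption; tune it to the density of your data).
w = DistanceBand.from_dataframe(centroids, threshold=500, binary=True)
```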
Besides that, I would suggest keeping the input below roughly 10,000 features, especially when using points, if you want to see results in a reasonable time.
Performance is mainly limited by PySAL, and in my experience the bottleneck is always the generation of the spatial weight matrix (the plugin uses the default function to create it, while more advanced options are available through Python scripting).
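If you are comfortable scripting, you can also bypass the plugin entirely and run the statistics with PySAL's own packages. A rough sketch, continuing from the centroids above and assuming a numeric attribute column named `value` (a placeholder); k-nearest-neighbour weights are one of those "more advanced options", since a fixed k avoids empty neighbour sets and keeps the matrix sparse on large point sets:

```python
from libpysal.weights import KNN
from esda.moran import Moran_Local
from esda.getisord import G_Local

# k-nearest-neighbour weights: k=8 is an assumption, adjust to your data.
w = KNN.from_dataframe(centroids, k=8)
w.transform = "r"  # row-standardise the weights

y = centroids["value"].values  # "value" is a hypothetical column name

lisa = Moran_Local(y, w)            # local Moran's I
gi_star = G_Local(y, w, star=True)  # Getis-Ord Gi*

centroids["local_I"] = lisa.Is
centroids["gi_star_z"] = gi_star.Zs
```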
Another possible solution that does not involve coding would be to generate the spatial weight matrix with GeoDa and store it in a file, then use that file as the input for the Moran's I and Gi* computation, again from GeoDa.
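As a side note, if you later want to reuse the weights file that GeoDa writes (a .gal file for contiguity, .gwt for distance weights) from Python, libpysal can read it directly; a minimal sketch, with a hypothetical file name:

```python
from libpysal import io

# Read a GeoDa-generated weights file back into a PySAL W object.
w = io.open("buildings.gal").read()
```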
Hope this will be useful.
Daniele
Firstly, thanks for such a useful tool! This is not really an issue, but a question I have in mind. I am currently working on a project that requires hot spot analysis (both local Gi* and Moran's I). My data is huge: a shapefile with 600,000 buildings as polygons with attributes. Obviously I expected a single run to take a long time; however, it never ends, and I cannot even tell whether it has crashed or not. My computer is equipped with a 3970X CPU and 256 GB of memory, but since multiprocessing is not really possible, the hardware power is somewhat irrelevant. So my question is: approximately what is the maximum number of polygons the tool can currently handle? Thank you so much!