Closed Nishimara closed 5 months ago
Hi there! This is a known issue; big cities can still cause real trouble, and there's also a memory bug in some cases. I already put multiprocessing on the to-do list. There are some problems with it that I haven't figured out yet, but I'm working on it, and the step you mentioned will also be improved by that. In the meantime you can try to minimize the area you want to generate by adding a parameter to the query in src/getData.py, for example a zip code in case Saint Petersburg is split up into multiple districts. That would definitely be a little workaround.
Can I ask you how long it took until it reached the Calculating Layers step? I'm impressed that it even got that far haha!
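To illustrate the workaround above: I don't know the exact query string in src/getData.py, so everything below (the function name, the tag filters, the district name) is illustrative rather than the project's actual code, but narrowing an Overpass-style query to a single district could look roughly like this:

```python
def build_query(city, district=None):
    """Build an Overpass QL query, optionally restricted to one sub-area.

    Hypothetical sketch: the real query in src/getData.py may use
    different tags or a bounding box instead of a named area.
    """
    # Default: search the whole city.
    area_filter = f'area["name"="{city}"]->.searchArea;'
    if district is not None:
        # Restrict to one administrative sub-area (e.g. a city district)
        # so far less data is downloaded and processed.
        area_filter = f'area["name"="{district}"]->.searchArea;'
    return (
        "[out:json][timeout:900];\n"
        f"{area_filter}\n"
        '(way(area.searchArea)["building"];'
        'way(area.searchArea)["highway"];);\n'
        "out body;>;out skel qt;"
    )

# Querying only one district instead of all of Saint Petersburg:
small_query = build_query("Saint Petersburg", district="Admiralteysky District")
```

The idea is just to shrink the search area before the data ever reaches the layer-calculation step, which is where the single-core bottleneck bites.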
Thanks for the answer. I can't tell exactly how long I waited, but I managed to get some sleep before it reached the Calculating Layers step.
@louis-e I found a way to optimize this software so that generating cities as large as the one in the picture below is possible. By using memory maps instead of storing everything in memory, you could generate maps as large as you like. I'll send a pull request soon.
Thanks to @amir16yp this issue should be solved now, thus I will close it as completed!
Generating big cities is kind of a nightmare, because it's using only one core. It's stuck on "Calculating Layers" while recreating Saint Petersburg.
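For context on the single-core complaint: the usual way to parallelize a step like "Calculating Layers" is to split the map into independent tiles and hand them to a process pool. A minimal sketch (the tile and worker names are made up, not the project's code; the real per-tile work would be the layer calculation itself):

```python
from multiprocessing import Pool

def process_tile(bounds):
    """Stand-in for per-tile layer calculation; here it just returns the tile area."""
    min_x, min_y, max_x, max_y = bounds
    return (max_x - min_x) * (max_y - min_y)

def split_into_tiles(width, height, tile):
    """Cut the full map into tile*tile chunks that can be processed independently."""
    return [
        (x, y, min(x + tile, width), min(y + tile, height))
        for y in range(0, height, tile)
        for x in range(0, width, tile)
    ]

def calculate_layers(width, height, tile=512, workers=4):
    tiles = split_into_tiles(width, height, tile)
    with Pool(workers) as pool:
        # One tile per task keeps every core busy instead of just one.
        return pool.map(process_tile, tiles)
```

This only works cleanly if tiles don't depend on each other's results; features that cross tile borders would need a stitching pass afterwards, which is presumably one of the unsolved problems mentioned above.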