kartoza / docker-qgis-server

A dockerfile that contains a running QGIS server

QGIS Server Crashing with memory leak on Huge GeoJSON layer. #22

Open lucernae opened 5 years ago

lucernae commented 5 years ago

Problem

Due to the recent crash described here: https://github.com/kartoza/geosafe/issues/538 , we found that on a large GeoJSON layer (around 60 MB in our test case), memory usage rockets to 2 GB while processing the layer. Moreover, when it crashes, Apache doesn't release the resources, so the memory leaks.

Proposed Solutions

When processing a huge GeoJSON layer like this, we might have to disable multi-process/multi-threaded handling. Say the map client requests 4 tiles in parallel. Usually one QGIS Server instance takes all of these requests and handles them by forking into separate threads/workers. In this case, that causes the following problems:

  1. Each worker operates on its own memory pages (not shared), so for the 60 MB GeoJSON file in the test case we end up with 4 × 2 GB of memory consumption.
  2. When a worker crashes, the leaked memory stays allocated; it is not cleaned up.
  3. All 4 workers try to access the same resource, and a race condition happens (because tile processing is not quick enough).

So I suggest the following solutions:

  1. Give an option to use a separate Apache config file where scaling is handled at the container level (scaling instances, not Apache workers).
  2. In this config, use a maximum of 1 worker, so no race condition can happen inside that container.
  3. In this config, do not use keep-alive. We want to close the connection and make the worker restart fresh to avoid the memory leak.
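A minimal sketch of what such an Apache config could look like, assuming the prefork MPM and Apache 2.4 directive names (the actual image may use a different MPM, so directive names would need adjusting):

```apache
# Hypothetical single-worker config sketch (Apache 2.4, mpm_prefork).

# Keep exactly one worker per container: no in-container parallelism,
# so no race condition between tile requests.
StartServers            1
MinSpareServers         1
MaxSpareServers         1
MaxRequestWorkers       1

# Recycle the worker after every connection so any leaked memory
# is reclaimed when the child process exits.
MaxConnectionsPerChild  1

# Close the connection after each response; no keep-alive.
KeepAlive Off
```

Parallelism would then come from running several containers behind a load balancer, e.g. something like `docker-compose up --scale qgis-server=4` (service name hypothetical).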

CC @timlinux