Closed · KaushikSathvara closed this issue 2 years ago
This is expected behaviour. Independent workers will each allocate their own memory, so usage grows with the number of workers. Unfortunately, you cannot share the memory between them.
@serengil, thank you for developing Deepface. I am in a similar situation; what would be an alternative solution here to avoid high memory usage in Celery workers? @KaushikSathvara, how did you end up implementing this?
Reduced parallel workers for now, @shafay07, but still looking for a way to handle the situation.
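For anyone landing here: memory still cannot be shared across worker processes, as noted above, but a common mitigation (not confirmed by this thread) is to build the model once per worker process rather than inside every task call, e.g. with Celery's `worker_process_init` signal. A minimal sketch, assuming deepface's `DeepFace.build_model` / `DeepFace.verify` API; the task name and broker URL are illustrative:

```python
from celery import Celery
from celery.signals import worker_process_init
from deepface import DeepFace

app = Celery("tasks", broker="redis://localhost:6379/0")  # broker URL is illustrative

@worker_process_init.connect
def load_model(**kwargs):
    # Build the model once when the worker process starts; deepface caches
    # it per process, so later calls with the same model_name reuse this copy.
    DeepFace.build_model("VGG-Face")

@app.task
def verify_faces(img1_path, img2_path):
    # Reuses the model already cached in this worker process instead of
    # rebuilding it on every task invocation.
    return DeepFace.verify(img1_path, img2_path, model_name="VGG-Face")
```

Each worker process still holds its own copy of the weights, so total memory still scales with worker count; this only avoids repeated rebuilds within a worker.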
I have created a task and load a deepface model in the task method. But when Celery spawns more workers, it causes high memory usage, because each worker loads a separate deepface model in its own context. Can we share context between those workers, so the model is loaded once and each worker uses it to predict results?
EDIT: Please take a look at the following pseudo code for reference.
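(The original snippet was not captured in this thread; below is a minimal sketch of the pattern being described. The task name, model choice, and broker URL are illustrative, assuming deepface's `DeepFace.build_model` / `DeepFace.verify` API.)

```python
from celery import Celery
from deepface import DeepFace

app = Celery("tasks", broker="redis://localhost:6379/0")  # broker URL is illustrative

@app.task
def verify_faces(img1_path, img2_path):
    # The model is built inside the task, so every worker process that
    # executes this task loads its own copy of the weights into RAM.
    DeepFace.build_model("VGG-Face")
    return DeepFace.verify(img1_path, img2_path, model_name="VGG-Face")
```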
Please note that each time the above function executes, it builds the model and consumes a lot of RAM.