ulixee / hero


Heap overflow with 300 request/min #248

Open blakebyrnes opened 5 months ago

blakebyrnes commented 5 months ago

From Discord:

Running into some heap overflow errors when trying to deploy Ulixee Cloud; it's saying it hit the 2079 MB limit. Has anyone had this experience? I'm running around 20-35 concurrent Heros, but doing around 300 resource requests per minute. It seems to work fine for 20 minutes but then crashes consistently. The overflow message mentions sqlite3.
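For reference, a minimal sketch of the shape of that deployment: a fixed-size pool of concurrent Hero sessions (25 here, within the reported 20-35 range) working through a URL queue against an already-running Ulixee Cloud node. The URLs, core host/port, and pool size are illustrative assumptions, not details from the report.

```typescript
// Sketch only: approximates the reported deployment shape
// (~20-35 concurrent Hero sessions, each opened and closed per page).
// URLs, the core host/port, and the pool size are placeholder assumptions.
import Hero from '@ulixee/hero';

async function run(urls: string[], concurrency = 25): Promise<void> {
  const queue = [...urls];

  // Fixed-size worker pool: each worker repeatedly takes a URL,
  // opens a fresh Hero session, loads the page, and closes the session.
  await Promise.all(
    Array.from({ length: concurrency }, async () => {
      while (queue.length > 0) {
        const url = queue.shift();
        if (!url) break;
        const hero = new Hero({
          connectionToCore: { host: 'ws://localhost:1818' }, // assumed default Cloud address
        });
        try {
          await hero.goto(url);
          await hero.waitForPaintingStable();
        } finally {
          await hero.close();
        }
      }
    }),
  );
}

run(Array.from({ length: 500 }, (_, i) => `https://example.com/page/${i}`))
  .catch(console.error);
```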

Hey @blakeb! I've done some heap analysis, and it seems that some data is being retained even after hero.close(). The setup I'm using is loading a web page, running ~150 fetches, and then closing the session. The killer part is that the fetches are GraphQL, so the request and response bodies are huge. In the screenshots shown, the heap reached 1700 MB after cycling through quite a lot of Heros, and at least 600 MB of that heap is made up of strings (GraphQL query bodies). Happy to share an implementation example or any more details.

(Heap snapshot screenshots attached.)
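For concreteness, here is a rough sketch of that session cycle: one page load, ~150 large GraphQL POSTs via hero.fetch, then hero.close(). The GraphQL endpoint, query body, and connection host are hypothetical placeholders; the point is only to show the pattern whose request/response strings appear in the retained heap.

```typescript
// Sketch of the reported per-session cycle: one page load, many large
// GraphQL POSTs via hero.fetch, then hero.close().
// The endpoint, query, and core host below are placeholder assumptions.
import Hero from '@ulixee/hero';

async function runSession(targetUrl: string, graphqlEndpoint: string): Promise<void> {
  const hero = new Hero({
    connectionToCore: { host: 'ws://localhost:1818' }, // assumed Cloud address
  });
  try {
    await hero.goto(targetUrl);
    await hero.waitForPaintingStable();

    for (let i = 0; i < 150; i += 1) {
      // Large request + response bodies: these are the strings that
      // dominate the heap snapshots even after close().
      const response = await hero.fetch(graphqlEndpoint, {
        method: 'POST',
        headers: { 'content-type': 'application/json' },
        body: JSON.stringify({ query: '...large GraphQL query...', variables: { page: i } }),
      });
      await response.json(); // consume and discard; nothing is retained on purpose
    }
  } finally {
    await hero.close(); // heap analysis suggests session data survives this call
  }
}

runSession('https://example.com', 'https://example.com/graphql').catch(console.error);
```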