Open banditsmile opened 2 months ago
I use gpt-crawler to crawl a project with ~100k pages. By the time about 4k pages have been crawled, the process is using 4GB of memory and hits the Node.js heap limit, so it crashes.
The crawl also can't be resumed by restarting the process.
At this rate, crawling all 100k pages would seem to require roughly 100GB of memory.
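As a stopgap I can raise Node's heap cap before starting the crawler, which lets the crawl get further before it dies. This is only a guess at a workaround, not a fix for the underlying per-page memory growth, and the 8 GB figure is arbitrary:

```shell
# Raise V8's old-space heap limit to 8 GB for the crawler process.
# (Workaround only; memory still grows with the number of crawled pages.)
export NODE_OPTIONS="--max-old-space-size=8192"
npm start
```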