frankrousseau opened 9 years ago
Do we know why the server crashed? Is it related to the number of files, or to the total size?
Sorry, I don't know. Is that something you can reproduce?
So, I created 10,000 files of only 1 byte each and tried to upload them. Firefox crashed after ± 1:30, after eating all my memory. There seems to be a memory leak on the frontend (to be confirmed; maybe it is in the browser, not in our code).
After 1 h 30, only 1300 files were uploaded (I tried on my local VM, so this is not a network issue). Not that many.
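One way to keep the frontend's memory in check is to avoid queueing all 10,000 uploads at once and instead drain them with a small pool of workers. This is only a sketch of the idea, not our actual code; `uploadOne` stands in for whatever XHR/fetch call performs a single upload:

```javascript
// Sketch: upload files with bounded concurrency instead of firing
// thousands of requests at once. `uploadOne` is a hypothetical function
// (file) => Promise that performs one upload.
async function uploadAll(files, uploadOne, concurrency = 4) {
  const queue = files.slice(); // shared work queue
  const results = [];
  async function worker() {
    // Each worker pulls the next file until the queue is empty.
    while (queue.length > 0) {
      const file = queue.shift();
      results.push(await uploadOne(file));
    }
  }
  // Start `concurrency` workers draining the same queue.
  await Promise.all(Array.from({ length: concurrency }, worker));
  return results;
}
```

With a pool like this, at most `concurrency` requests (and their buffers) are alive at any time, which should keep memory roughly flat regardless of how many files are selected.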
Looking at the CouchDB logs, I see 26 GET requests (in around 2 s) for every uploaded file!
So I think there's probably some room for improvement ;-)
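If many of those GETs are per-document lookups, CouchDB can return a whole batch of documents in a single round trip via `POST /{db}/_all_docs?include_docs=true` with a `keys` body. A minimal sketch (the database URL and ids are placeholders, not our actual schema):

```javascript
// Sketch: fetch N CouchDB documents in one request instead of N GETs,
// using the documented POST /{db}/_all_docs endpoint with a `keys` body.
async function bulkFetch(dbUrl, ids) {
  const res = await fetch(`${dbUrl}/_all_docs?include_docs=true`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ keys: ids }),
  });
  const { rows } = await res.json();
  // Missing ids come back as rows with an `error` field and no `doc`;
  // keep only the rows that resolved to a document.
  return rows.filter((row) => row.doc).map((row) => row.doc);
}
```

Collapsing 26 GETs into one POST like this would also cut the per-request HTTP overhead, which likely dominates when the files themselves are 1 byte.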
(FTR: to create the 10,000 files, I used this command: `for i in {1..10000}; do dd if=/dev/urandom bs=1 count=1 of=file$i; done`)
BTW, congrats @jsilvestre for the work you did on displaying folders with lots of files: it works perfectly \o/
I can confirm that uploading large collections of elements is not efficient on our side. There are constraints, like detecting whether there are files to overwrite, but here are some inputs:
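The overwrite-detection constraint doesn't have to cost one existence check per file: a single listing of the target folder is enough to classify every candidate locally. A sketch, assuming `existingNames` comes from one request that lists the folder's current file names (the function name is made up):

```javascript
// Sketch: split upload candidates into overwrites vs. new files using a
// single folder listing instead of one existence request per file.
// `existingNames` is assumed to be the result of one listing request.
function splitByOverwrite(candidates, existingNames) {
  const existing = new Set(existingNames); // O(1) membership checks
  const toOverwrite = [];
  const toCreate = [];
  for (const name of candidates) {
    (existing.has(name) ? toOverwrite : toCreate).push(name);
  }
  return { toOverwrite, toCreate };
}
```

For a folder of 8000 files this turns thousands of round trips into one, and the in-memory `Set` keeps the per-file check constant time.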
There are probably also some optimizations to perform on the server side. I can confirm that when I upload a single file, I see 27 requests for this file in the CouchDB logs (even if the indexer is stopped).
Some optimizations are in the DS:
Not sure where the 17 others come from :-/
A user complained that he wasn't able to upload his picture folder of 8000 files (his whole Cozy crashed).
We should (at least):