juanep97 opened 1 year ago

I'm trying to access (read) a file from different processes, but httpdirfs (-f --cache) crashes.

PS: thanks for the nice software!
Could you describe what you did in a bit more detail please?
I mounted a directory with some 100 MB–300 MB files in it and ran a program that needs to read / load several of them. When the program runs in a single thread, it just takes time: the files are downloaded and cached normally. But when it runs in parallel (~6 processes), I get the error message above (using httpdirfs -f --cache).
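For reference, here is a minimal shell sketch of the access pattern that triggers it. The URL, mount point, and file names are placeholders, not my actual setup:

```bash
#!/usr/bin/env bash
# Mount a remote directory with caching enabled (placeholder URL and path).
mkdir -p /tmp/mnt
httpdirfs --cache https://example.com/archive/ /tmp/mnt

# Read several large files concurrently, roughly like the ~6-process run.
for f in /tmp/mnt/file1.bin /tmp/mnt/file2.bin /tmp/mnt/file3.bin \
         /tmp/mnt/file4.bin /tmp/mnt/file5.bin /tmp/mnt/file6.bin; do
    dd if="$f" of=/dev/null bs=1M &
done
wait

fusermount -u /tmp/mnt
```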
What program is this? Which HTTP server are you using? I need to be able to reproduce the problem on my end.
Also, could you compile HTTPDirFS at commit 1e808448313b96b6080cc659a8929f4235ee5cb4 and see if the problem still occurs?
Hi, sorry for the long delay, I've been quite busy with academic life. I use HTTPDirFS as a workaround in https://github.com/juanep97/iop4 to mount a remote archive and avoid having to download ~35 GB of files during CI. But as you can see, the tests fail most of the time because the program does not find the files where they should be; the filesystem seems to stop being mounted at some point (maybe after a connection error?). The version I use is the one in the Ubuntu repo. I will check whether the problem still occurs with the commit you mention.
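In case it helps others hitting this in CI, a rough sketch of how one might fail fast when the mount drops mid-run. The paths and the `pytest` test runner are placeholders for my setup; `mountpoint` is the util-linux utility:

```bash
#!/usr/bin/env bash
set -e
# Mount the archive; httpdirfs daemonizes after mounting (placeholder URL/path).
mkdir -p "$HOME/archive"
httpdirfs --cache https://example.com/archive/ "$HOME/archive"

# Fail fast if the FUSE mount never came up.
mountpoint -q "$HOME/archive" || { echo "mount failed" >&2; exit 1; }

pytest  # run the test suite against the mounted archive

# Check again afterwards to catch a mount that dropped during the run.
mountpoint -q "$HOME/archive" || { echo "mount dropped during tests" >&2; exit 1; }

fusermount -u "$HOME/archive"
```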
No worries, I am a busy person too. ~I actually have a rough suspicion of what might be causing the problem. Since you are using this for science, I am going to put this on higher priority. When do you need this to be done by?~

Edit: Apologies, I confused this issue with the other two issues that are caused by curl_easy_escape. Debugging lock problems is not exactly easy.
Is this issue only observed within CI, or does it happen in real life too? Could you point me to a CI run where this failure occurs?
You could try running httpdirfs in single-threaded mode by passing in `-s`. This is mentioned in the README under "Useful FUSE options".
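For example (the URL and mount point are placeholders):

```bash
# -f keeps httpdirfs in the foreground; -s forces single-threaded FUSE
# operation, which rules out the multi-threaded locking path.
httpdirfs -f -s --cache https://example.com/archive/ /tmp/mnt
```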