I have a huge local repo, and the sourcerer app takes about 5 hours to hash it.
The problem is that after about 40 minutes, the status of the hashing process on the server changes to FAILED with reason "Timeout reached".
The local app keeps hashing for several hours, but once it finishes, all I get is an error message saying there was an error communicating with the server, and the status on the website changes to FAILED with reason "Processing error".
Wouldn't it be better to store the hashing results locally and upload them at the end, instead of sending keepalives the whole time?
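The batch-upload approach suggested above could look something like the sketch below. This is just an illustration, not sourcerer's actual code: `hash_repo_offline`, `save_results`, and `upload_results` are hypothetical names, and the server endpoint is a placeholder. The key idea is that no network traffic happens until all hashing is done, and the persisted file means a failed upload can be retried without redoing hours of work.

```python
import hashlib
import json


def hash_repo_offline(commits, hash_fn):
    # Hash every commit locally with no server communication,
    # so a multi-hour job cannot hit a server-side timeout.
    return [{"commit": c, "hash": hash_fn(c)} for c in commits]


def save_results(results, path):
    # Persist the finished results to disk so the final upload
    # can be retried if it fails, without re-hashing anything.
    with open(path, "w") as f:
        json.dump(results, f)


def upload_results(path):
    # Hypothetical single POST at the very end, replacing the
    # per-commit keepalives. The endpoint below is a placeholder.
    with open(path) as f:
        payload = json.load(f)
    # requests.post("https://example.invalid/api/hashes", json=payload)
    return len(payload)


# Example run with a trivial hash function standing in for the real one:
commits = ["c1", "c2", "c3"]
results = hash_repo_offline(
    commits, lambda c: hashlib.sha1(c.encode()).hexdigest()
)
save_results(results, "results.json")
uploaded = upload_results("results.json")
```

The timeout problem then reduces to one bounded upload at the end, rather than a connection the server must keep open for the whole hashing run.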
Yeah, I'm having the same issue. I have about 5 large local repos, and it can only get through the smallest of them. I've had to skip a lot of commits to get it to finish within the timeout window.