psychopy / pyosf

A pure python library for simple sync with Open Science Framework

Need to handle failed uploads #5

Open peircej opened 7 years ago

peircej commented 7 years ago

I find relatively often that uploading files results in a requests timeout. I don't know whether that's an issue with the OSF.io server or a problem in the pyosf code.

What is definitely a problem with pyosf currently is that when this occurs, the local database still thinks the file was successfully sent. When it then discovers that the file isn't there remotely, it thinks the file has been deleted on the server and deletes the local copy too! This is obviously bad. We need to fix this by:
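A minimal sketch of the missing failure detection, with hypothetical names (pyosf's actual upload call is not shown in this thread): a timeout has to be caught and reported as a failure, so the index is only updated after a confirmed success.

```python
import requests


def upload_with_status(url, filepath, timeout=30):
    """Attempt an upload and report success/failure explicitly.

    Hypothetical helper, not pyosf's real API. The point is that a
    timeout (or any transfer/file error) must be surfaced as False,
    not silently treated as a completed upload.
    """
    try:
        with open(filepath, "rb") as f:
            resp = requests.put(url, data=f, timeout=timeout)
        return resp.status_code in (200, 201)
    except (requests.exceptions.RequestException, OSError):
        # Includes Timeout and ConnectionError: treat as "not uploaded"
        return False
```

The caller can then decide whether to retry or to leave the local index untouched.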

peircej commented 7 years ago

@lindemann09 wanted to make sure you're aware of this issue, in the interests of not losing files! :-)

peircej commented 7 years ago

The key problem is around line 245 of sync.py :

        # when local/remote updates are complete refresh index based on local
        proj.local.rebuild_index()
        proj.index = proj.local.index

At the end of a sync this assumes the sync was successful and sets proj.index (i.e. the database) to be the same as local.index. Instead, we need to track proj.index and update it on each successful operation. That will be uglier, but safer in a world where sync operations can fail.
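The per-operation bookkeeping could look something like this sketch (hypothetical names throughout; `perform` stands in for whatever actually does the transfer): each entry is written into the index only after its operation completes, so a failed upload leaves the database unchanged.

```python
def apply_changes_safely(proj, changes, perform):
    """Sketch of per-operation index updates (hypothetical API).

    `changes` is assumed to be an iterable of (action, entry) pairs and
    `perform(action, entry)` raises on failure. Instead of rebuilding
    the whole index after sync, record each file's new state only when
    its upload/download/delete actually succeeded.
    """
    for action, entry in changes:
        try:
            perform(action, entry)  # do the actual transfer
        except Exception:
            continue  # failed: leave the index untouched for this entry
        # Only now mark this entry as synced in the project index
        proj.index[entry["path"]] = entry
```

With this pattern, a file whose upload timed out still looks "not yet sent" on the next sync, rather than "deleted remotely".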

lindemann09 commented 7 years ago

Thanks for the update. I didn't have the problem yet, but I'll keep an eye on it.

aubreymoore commented 6 years ago

I am having the same problem: pyosf wants to delete files on my local machine following failed uploads, which occur frequently out here on the fringes of the net (I live on Guam).

As a workaround to prevent data loss, I always run print(changes) to examine the proposed changes before executing changes.apply(). If the proposed changes indicate that local files would be deleted, I run changes = proj.get_changes() and print(changes) again.
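That manual check could be scripted roughly as follows. Only proj.get_changes(), changes.apply(), and print(changes) come from this thread; the string test for proposed local deletions is a heuristic, since the exact structure of the changes object is not shown here.

```python
def safe_sync(proj, max_retries=3, deletion_marker="del_local"):
    """Workaround sketch: refuse to apply changes that delete local files.

    `deletion_marker` is an assumed substring appearing in the printed
    summary when a local deletion is proposed; adjust it to whatever
    print(changes) actually shows for your version of pyosf.
    """
    for _ in range(max_retries):
        changes = proj.get_changes()
        summary = str(changes)
        print(summary)  # always inspect before applying
        if deletion_marker not in summary.lower():
            changes.apply()
            return True
        # Local deletions proposed: likely a failed upload, so re-fetch
        # the changes and check again rather than losing files.
    return False
```

Returning False lets the caller know the sync was skipped entirely rather than partially applied.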

pyosf is a great tool and I hope this bug can be fixed.