AnantLabs / big5sync

Automatically exported from code.google.com/p/big5sync

Thread dies after synching large volume of files #29

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. Sync a folder containing a large volume of files with an empty folder.
2. The process dies and no files are copied over.

What is the expected output? What do you see instead?

All files from the source folder (the large volume of files) should be copied over. Instead, nothing is copied. MD5 takes too long to calculate the hash for each file, so the thread most likely dies along the way.


Original issue reported on code.google.com by aurigatr...@hotmail.com on 16 Mar 2010 at 3:51
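
To illustrate the suspected bottleneck, here is a minimal Java sketch (not the project's actual code; class and method names are illustrative) of per-file MD5 hashing. For a folder with many large files this runs once per file, so the total time grows with the total number of bytes on disk, which is consistent with the long stall the reporter describes.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Md5Hasher {
    // Reads the whole file and returns its MD5 digest as a hex string.
    public static String hashFile(String path) throws IOException, NoSuchAlgorithmException {
        MessageDigest md5 = MessageDigest.getInstance("MD5");
        byte[] buffer = new byte[8192];
        try (InputStream in = new FileInputStream(path)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                md5.update(buffer, 0, read);
            }
        }
        // Convert the 16-byte digest to lowercase hex.
        StringBuilder hex = new StringBuilder();
        for (byte b : md5.digest()) {
            hex.append(String.format("%02x", b & 0xff));
        }
        return hex.toString();
    }
}
```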

GoogleCodeExporter commented 9 years ago
Changed hashing from MD5 to Adler32 for speed (Adler32 is a weaker checksum, but the collision rate is very low for our purposes). Please retest. The method name is unchanged to prevent code breakage for now.

Original comment by sohyuanc...@gmail.com on 16 Mar 2010 at 4:41
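
For comparison, a minimal Java sketch (assumed, not the project's code) of the same read loop with java.util.zip.Adler32 swapped in for MD5, mirroring the fix described above while keeping a hashFile-style signature. Adler32 is a 32-bit checksum, so collisions are more likely than with MD5, but it is much cheaper to compute per byte.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.Adler32;

public class Adler32Hasher {
    // Same read loop as the MD5 sketch; only the digest object changes.
    public static String hashFile(String path) throws IOException {
        Adler32 adler = new Adler32();
        byte[] buffer = new byte[8192];
        try (InputStream in = new FileInputStream(path)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                adler.update(buffer, 0, read);
            }
        }
        // Return the checksum as hex so callers see the same shape of value
        // as the old MD5 hex digest.
        return Long.toHexString(adler.getValue());
    }
}
```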

GoogleCodeExporter commented 9 years ago
The thread no longer dies, but the sync still takes a while.

Original comment by gohkhoon...@gmail.com on 30 Mar 2010 at 6:58

GoogleCodeExporter commented 9 years ago

Original comment by gohkhoon...@gmail.com on 31 Mar 2010 at 12:07