mnutt / davros

Personal file storage server
Apache License 2.0

Space usage report differs between owncloud client and davros web interface #36

Open ovizii opened 8 years ago

ovizii commented 8 years ago

How come? Btw. publishing is disabled. Is there some versioning going on here? A recycling bin somewhere? I'm at a loss.

[Screenshot: ownCloud client storage report, 2016-03-21]

paulproteus commented 8 years ago

Hi! This is likely due to the ownCloud client calculating only the size of the files, but the Sandstorm grain using up space for other things. I don't personally know what other things - I'd have to go read the Davros source to find out.

mnutt commented 8 years ago

That is certainly curious. The only thing I can think of is that when you upload larger files (> 5MB) they get sent in chunks; davros stores those chunks in a temp directory and then assembles the pieces. It's possible that it's not correctly cleaning them up.
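If leftover chunks are indeed the cause, here is a minimal sketch of how one could check for and clear them from inside the grain. The path, the assumption that chunk files sit directly under tmp/, and the one-day staleness cutoff are guesses rather than confirmed davros behaviour; the jsdav entry is excluded on the assumption that the running server still needs it.

    # Assumed working directory; substitute your actual grain id.
    cd /opt/sandstorm/var/sandstorm/grains/*somegrain*/sandbox/davros

    # Dry run: list tmp entries untouched for more than a day (jsdav excluded).
    find tmp -mindepth 1 -maxdepth 1 ! -name jsdav -mtime +1

    # If that list contains only orphaned upload chunks, remove them.
    find tmp -mindepth 1 -maxdepth 1 ! -name jsdav -mtime +1 -exec rm -rf {} +

Freeing tmp/ this way should be reflected in the grain's reported size, which would at least confirm the chunks account for the bulk of the discrepancy.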

ovizii commented 8 years ago

Hm, so I went into the grain via SSH and found this; does that help you figure it out?

    root@servername:/opt/sandstorm/var/sandstorm/grains/*somegrain*/sandbox/davros# du -cksh *
    79M     data
    96M     tmp
    174M    total
    root@servername:/opt/sandstorm/var/sandstorm/grains/*somegrain*/sandbox/davros# du -cksh tmp/*
    5.0M    tmp/2663229824
    5.0M    tmp/2663231173
    15M     tmp/2797048495
    71M     tmp/3034574969
    4.0K    tmp/jsdav
    96M     total

Anything else I can supply to help figure it out? This davros instance is only used in two locations, and not very actively; at most a few files change every other week.

edit

sorry for the bad formatting, not sure how to fix it :-(
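One more data point that might help, assuming the same working directory as in the listing above: the modification times of those tmp entries would show roughly when the orphaned chunks were left behind, and whether they line up with larger uploads.

    # Assumed path as above; newest entries first.
    ls -lt tmp/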

mnutt commented 8 years ago

That sounds like we're not cleaning up tempfiles properly, though I don't know what makes up the difference between the 174MB on disk and the 224MB shown in the grain status.
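One possible explanation for the remaining ~50MB, though this is an assumption about Sandstorm's accounting rather than something confirmed here, is that the grain status counts the entire grain directory (logs and other non-davros data included), while the du above only covered sandbox/davros. Measuring one level up would show whether that accounts for the rest:

    # Assumed path, reusing the placeholder grain id from the earlier listing.
    du -cksh /opt/sandstorm/var/sandstorm/grains/*somegrain*/*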

ovizii commented 8 years ago

I can always destroy my grain and copy my stuff into a new grain, not a problem. Just wanted to check if I can help debug it somehow. I'll keep the grain for a while; maybe you'll come up with an idea.

Otherwise, feel free to close this issue.