gbook / nidb

NeuroInformatics Database
GNU General Public License v3.0

Too many files #95

Closed andersonwinkler closed 3 years ago

andersonwinkler commented 3 years ago

Hi Greg,

The partition we use for /nidb/data/archive has 20 TB, of which we are using a little over 10 TB. However, we've run out of inodes (100% usage!), so we can't create any new files.

Your instance at ONRC surely has more files. How do you deal with this?

Thanks!

Cheers,

Anderson
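(For anyone hitting the same wall: inode exhaustion can be confirmed with `df -i` even when `df -h` shows plenty of free space. A minimal sketch below wraps that check in a reusable function; the mount point and the 90% threshold are placeholders, and the column parsing assumes GNU/Linux `df` output.)

```shell
#!/bin/sh
# Warn when inode usage on a filesystem exceeds a threshold.
# Usage: check_inodes <mountpoint> [threshold_percent]
check_inodes() {
    mount="$1"
    threshold="${2:-90}"
    # df -Pi prints POSIX-format inode stats; IUse% is the 5th field
    # on the data line (GNU/Linux df column layout assumed).
    pct=$(df -Pi "$mount" | awk 'NR==2 {gsub(/%/, "", $5); print $5}')
    if [ "$pct" -ge "$threshold" ]; then
        echo "WARNING: $mount at ${pct}% inode usage"
        return 1
    fi
    echo "OK: $mount at ${pct}% inode usage"
    return 0
}

# Example: check the root filesystem against the default 90% threshold.
check_inodes /
```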

gbook commented 3 years ago

What filesystem are you using? I believe we are using zfs with around 20TB of raw data.

Also, you may want to attempt an upgrade of your instance to the current version of nidb. It won't fix the inode issue, but there are lots of bug fixes and improvements since last summer.


andersonwinkler commented 3 years ago

Thanks for the quick response. I think we are using ext4. I'll move the data off, reformat the partition as ZFS, and move it back.

Will upgrade too, many thanks!
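(For reference: ext4 fixes its inode count at `mkfs` time, while ZFS allocates inodes dynamically, so a ZFS pool cannot run out of inodes before it runs out of space. A hedged sketch of the migration target is below; the device, pool, and dataset names are hypothetical and must be adjusted for the actual hardware. These are provisioning commands, not a runnable script.)

```shell
# Hypothetical device/pool/dataset names; adjust for your system.
zpool create archivepool /dev/sdb        # create a pool on the spare device
zfs create archivepool/nidb              # dataset to hold the archive
zfs set mountpoint=/nidb/data/archive archivepool/nidb
# Then copy the data back, e.g. with rsync, and verify with df -i.
```

(An alternative, if staying on ext4, is to reformat with a smaller bytes-per-inode ratio, e.g. `mkfs.ext4 -i 4096`, which allocates more inodes per unit of space than the default.)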