liuis / leveldb

Automatically exported from code.google.com/p/leveldb

IO error: db//081444.sst: Too many open files #45

Closed: GoogleCodeExporter closed this issue 9 years ago

GoogleCodeExporter commented 9 years ago
Hi all,

I use:

    leveldb::Iterator* it = db->NewIterator(leveldb::ReadOptions());
    for (it->SeekToFirst(); it->Valid(); it->Next()) {
      ...
    }
    delete it;  // release the iterator when done

to iterate over a LevelDB database, but I get the following error:
"IO error: db//081444.sst: Too many open files"

If I process a smaller database everything is OK, so I suspect the problem is
that there are too many *.sst files to process, but I haven't been able to find
a solution for this by searching.

Can anyone help me resolve this problem? Thanks a lot!

Original issue reported on code.google.com by GaryPY...@gmail.com on 9 Oct 2011 at 9:12

GoogleCodeExporter commented 9 years ago
You didn't specify which OS you're on. You can try increasing the open-files 
limit in your OS: 
http://stackoverflow.com/questions/34588/how-do-i-change-the-number-of-open-files-limit-in-linux

Original comment by kkowalczyk@gmail.com on 9 Oct 2011 at 9:26
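
(For reference: besides ulimit, on Linux/POSIX the soft limit can also be 
raised from inside the process, up to the hard limit, via setrlimit. A minimal 
sketch, nothing LevelDB-specific:)

    #include <sys/resource.h>
    #include <cstdio>

    // Raise this process's soft open-file limit to the hard maximum.
    int main() {
      struct rlimit rl;
      if (getrlimit(RLIMIT_NOFILE, &rl) != 0) {
        perror("getrlimit");
        return 1;
      }
      rl.rlim_cur = rl.rlim_max;  // raising rl.rlim_max itself requires root
      if (setrlimit(RLIMIT_NOFILE, &rl) != 0) {
        perror("setrlimit");
        return 1;
      }
      return 0;
    }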

GoogleCodeExporter commented 9 years ago
Thank you. I'm using SUSE.

Original comment by GaryPY...@gmail.com on 9 Oct 2011 at 9:50

GoogleCodeExporter commented 9 years ago
Just stumbled across this issue. Maybe the LevelDB documentation should include 
estimates of how many open file descriptors will be used as a function of total 
data size. This could come in handy for production planning. Just an idea.

Original comment by jens.ran...@gmail.com on 24 Nov 2012 at 6:08
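
(A rough estimate, assuming the defaults of this era of LevelDB: table files 
are targeted at about 2 MB each, so a database of S bytes holds on the order of 
S / 2 MB .sst files. A 40 GB database would then have roughly 20,000 table 
files; options.max_open_files caps how many of those the table cache keeps open 
at once.)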

GoogleCodeExporter commented 9 years ago
The open-files limit on Linux apparently can only be raised to a maximum of 
1,048,576, which is insufficient for my use of LevelDB. Does anyone have 
suggestions for getting LevelDB to limit the number of files it has open at any 
given time? The number of open files rises very slowly at first, but there 
appears to be a point at which it suddenly starts rising very quickly. I'm 
using LevelDB to access several million records of about 250 bytes each, across 
about a dozen databases at the same time. Is there some inherent limitation 
that I'm bumping up against? This is on a Linux system with 32 GB of real 
memory.

Original comment by stephen....@gmail.com on 3 Jan 2013 at 12:27

GoogleCodeExporter commented 9 years ago
I should note that it is an x86_64 system.

Original comment by stephen....@gmail.com on 3 Jan 2013 at 12:35

GoogleCodeExporter commented 9 years ago
@stephen.p.morgan - Does options.max_open_files[1] not do what you want?

1. https://code.google.com/p/leveldb/source/browse/include/leveldb/options.h#84

Original comment by dgrogan@chromium.org on 14 Feb 2013 at 10:40
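
(A minimal sketch of setting max_open_files when opening a database; the path 
and the value 500 are made up for illustration:)

    #include <cassert>
    #include "leveldb/db.h"

    int main() {
      leveldb::Options options;
      options.create_if_missing = true;
      options.max_open_files = 500;  // table-cache cap; the default is 1000
      leveldb::DB* db;
      leveldb::Status s = leveldb::DB::Open(options, "/tmp/testdb", &db);
      assert(s.ok());
      // ... reads and writes ...
      delete db;
      return 0;
    }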

GoogleCodeExporter commented 9 years ago
Assuming this is fixed by options.max_open_files.

Original comment by dgrogan@chromium.org on 29 May 2013 at 2:27

GoogleCodeExporter commented 9 years ago
I'm running leveldb 1.11.0 with max_open_files set to 512.

I have a large database that still seems to cause leveldb to open 10000+ files 
before it hits my per-process open file limit.

Any hints? Should I set max_open_files to something much smaller?

Original comment by fullung@gmail.com on 29 Jul 2013 at 12:15
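
(One way to watch this happen on Linux is to count the entries in 
/proc/self/fd from inside the process; a small sketch, nothing 
LevelDB-specific:)

    #include <dirent.h>

    // Returns the number of file descriptors this process currently holds,
    // or -1 on error. Subtracts ".", "..", and the descriptor opendir uses.
    int CountOpenFds() {
      DIR* dir = opendir("/proc/self/fd");
      if (dir == NULL) return -1;
      int n = 0;
      while (readdir(dir) != NULL) n++;
      closedir(dir);
      return n - 3;
    }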

GoogleCodeExporter commented 9 years ago
Databases above a certain size (about 40 GB) seem to cause LevelDB to open 
every single file in the database without closing anything in between.

Original comment by fullung@gmail.com on 15 Aug 2013 at 3:08