python / cpython

The Python programming language
https://www.python.org

Built-in open function fail. Too many file open #45052

Closed ed9ddd6e-1866-4fd1-beb3-d0235f08af0e closed 17 years ago

ed9ddd6e-1866-4fd1-beb3-d0235f08af0e commented 17 years ago
BPO 1732686
Nosy @loewis, @birkenfeld

Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.


GitHub fields:

```python
assignee = None
closed_at =
created_at =
labels = ['invalid']
title = 'Built-in open function fail. Too many file open'
updated_at =
user = 'https://bugs.python.org/alexteo21'
```

bugs.python.org fields:

```python
activity =
actor = 'loewis'
assignee = 'none'
closed = True
closed_date = None
closer = None
components = ['None']
creation =
creator = 'alexteo21'
dependencies = []
files = []
hgrepos = []
issue_num = 1732686
keywords = []
message_count = 6.0
messages = ['32240', '32241', '32242', '32243', '32244', '32245']
nosy_count = 3.0
nosy_names = ['loewis', 'georg.brandl', 'alexteo21']
pr_nums = []
priority = 'normal'
resolution = 'not a bug'
stage = None
status = 'closed'
superseder = None
type = None
url = 'https://bugs.python.org/issue1732686'
versions = ['Python 2.3']
```

ed9ddd6e-1866-4fd1-beb3-d0235f08af0e commented 17 years ago

Hi,

I have created a cron script using python. Every hour it will batch process certain files

e.g.

```python
t = open(filename, 'rb')
data = t.read()
# processing data...
t.close()
```
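(Editor's note: a minimal, hedged sketch of a more defensive version of the snippet above; `filename` and the processing step are placeholders taken from the report. Wrapping the work in `try`/`finally` releases the descriptor even if the processing step raises.)

```python
t = open(filename, 'rb')
try:
    data = t.read()
    # processing data...
finally:
    # Runs even if the processing above raises, so the descriptor is released.
    t.close()
```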

The script works fine on the day of execution and is able to process the files. However, when the next day comes, the processing fails:

```
Traceback (most recent call last):
  File "./alexCopy.py", line 459, in processRequestModule
    sanityTestSteps(reqId,model)
  File "./alexCopy.py", line 699, in sanityTestSteps
    t = open(filename, 'rb')
IOError: [Errno 24] Too many open files:
```

I have explicitly closed the file. Please help.

Thanks, Alex

birkenfeld commented 17 years ago

Please do *not* open another bug when your first one was closed!

If you think closing it was not correct, reopen the first one. In this case, however, I referred you to comp.lang.python, so *please* post the issue (and your script) there.

ed9ddd6e-1866-4fd1-beb3-d0235f08af0e commented 17 years ago

Additional info: I am running the script on Solaris. I have a similar script in Tcl, and there is no issue there. It only happens with Python.

birkenfeld commented 17 years ago

We can't even remotely guess at the source of your exception with that info, even using our best crystal balls.

But I see that you have posted on comp.lang.python, so let's see what comes out of it there.

ed9ddd6e-1866-4fd1-beb3-d0235f08af0e commented 17 years ago

Hi gbrandl,

I think I understand where the problem is, but I am not sure if this is a bug in Python. In the code, I am using the pexpect module, which spawns a child for an FTP session. After the child is closed, the file descriptor is still open (check /proc/\<process>/fd).

I believe this is what causes the "too many open files" issue. However, when I run the same thing on Linux, the file descriptor is closed properly.

Is this an issue with the Python pty module? It seems that the pty module on Linux is more stable than on Solaris.

Very much appreciate your comments

Thanks, Alex
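(Editor's note: a small sketch of the descriptor check described above. Both Linux and Solaris expose a process's open descriptors under /proc/\<pid>/fd; the helper name is illustrative, not from the original script.)

```python
import os

def open_fd_count():
    # Count this process's currently open file descriptors via /proc.
    fd_dir = '/proc/%d/fd' % os.getpid()
    return len(os.listdir(fd_dir))

print('open descriptors: %d' % open_fd_count())
```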

61337411-43fc-4a9c-b8d5-4060aede66d0 commented 17 years ago

It's most likely a bug in your application; you should invoke .close() on the spawn object. If you do invoke .close() and the connection still stays open, it's a bug in pexpect, please report it to the authors of pexpect. Closing as invalid.
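(Editor's note: a hedged sketch of the fix suggested here, using the pexpect API (`pexpect.spawn`, `expect`, `sendline`, `close`); the host name and FTP prompts are placeholders, not taken from the original script.)

```python
import pexpect  # third-party module

child = pexpect.spawn('ftp ftp.example.com')  # placeholder host
try:
    child.expect('Name .*: ')
    child.sendline('anonymous')
    child.expect('Password:')
    child.sendline('guest@example.com')
    child.sendline('bye')
    child.expect(pexpect.EOF)
finally:
    # Explicitly close the spawn object so the pty descriptors are released.
    child.close()
```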