Closed 9 years ago
This bug is very similar to bpo-18879; the only difference is that _TemporaryFileWrapper.__iter__ is the problem (in bpo-18879, __getattr__ was fixed, but __iter__ was not). The real-world use case that helped me find this bug is at the bottom of this report; this is a simple reproducer:
>>> import tempfile
>>> for l in tempfile.NamedTemporaryFile(mode='a+b'):
...     print(l)
...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: readline of closed file
I'm attaching a patch that fixes this (+ testcase).
Note: I actually discovered this while using
>>> from urllib.request import urlopen
>>> for l in urlopen('ftp://<some_ftp>'):
...     print(l)
...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: readline of closed file
Opening FTP uses urllib.response, which in turn uses tempfile._TemporaryFileWrapper, which makes this example fail.
I'm attaching a second version of the patch. It now contains a link to this bug and a more realistic test case, following a suggestion from the review.
One of the reviews of the first patch version mentioned that there should be a better explanation of how this issue can be provoked. I think the test shows it nicely and there's no need to explain it in greater detail. Also, the comment references this bug, where anyone can find the explanation. (The fix for bpo-18879 fixes pretty much the same thing in a different method and has a very similar comment, so I think this should be fine.)
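[Editorial note] A sketch of the failure mode and the shape of the fix may help. BrokenWrapper and FixedWrapper below are illustrative stand-ins for _TemporaryFileWrapper, not the actual patch, and the "collected immediately" behavior assumes CPython's reference counting:

```python
import io

class BrokenWrapper:
    """Stand-in for _TemporaryFileWrapper: closes the underlying
    file when the wrapper itself is collected."""
    def __init__(self, file):
        self.file = file

    def __del__(self):
        self.file.close()

    def __iter__(self):
        # Bug: hands out the raw file iterator, so nothing keeps the
        # wrapper alive and __del__ can run mid-iteration.
        return iter(self.file)

class FixedWrapper(BrokenWrapper):
    def __iter__(self):
        # A generator that closes over self keeps the wrapper (and
        # therefore the open file) alive while the iterator exists.
        for line in self.file:
            yield line

def first_line(wrapper_cls):
    it = iter(wrapper_cls(io.StringIO('spam\neggs\n')))
    # The wrapper is unreferenced at this point; CPython's reference
    # counting collects it immediately in the broken case.
    return next(it)

print(first_line(FixedWrapper))   # iteration succeeds
try:
    first_line(BrokenWrapper)
except ValueError as exc:
    print('broken:', exc)         # the file was already closed
```

The same mechanism explains the urlopen example below: the wrapper is only reachable through the iterator, so the iterator must hold a reference to it.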
LGTM.
Agreed, the test is sufficient documentation. However, I can't make the test fail here (Windows 7, Python 3.4.3):
py ti.py
b'spam\n' b'spam\n'
b'eggs\n' b'eggs\n'
b'beans\n' b'beans\n'

cat ti.py
import tempfile

def test_iter():
    # getting an iterator from a temporary file should keep it alive
    # as long as it's being iterated over
    lines = [b'spam\n', b'eggs\n', b'beans\n']
    def make_file():
        f = tempfile.NamedTemporaryFile(mode='w+b')
        f.write(b''.join(lines))
        f.seek(0)
        return iter(f)
    for i, l in enumerate(make_file()):
        print(l, lines[i])

test_iter()
Is it somehow OS-specific?
Regardless, the patch seems fine and I have no problem with it being applied.
New changeset 7fa741fe9425 by Serhiy Storchaka in branch '3.4': Issue bpo-23700: Iterator of NamedTemporaryFile now keeps a reference to https://hg.python.org/cpython/rev/7fa741fe9425
New changeset c84a0b35999a by Serhiy Storchaka in branch 'default': Issue bpo-23700: Iterator of NamedTemporaryFile now keeps a reference to https://hg.python.org/cpython/rev/c84a0b35999a
Thank you for your contribution Bohuslav.
Thank you! To answer Paul's question: I honestly have no idea why this can't be reproduced on Windows. I managed to reproduce it in 100% of cases on various RPM-flavour Linux distros (Fedora, CentOS, RHEL) as well as on Debian and Ubuntu.
Cool, no problem.
test_csv now fails on Windows:
http://buildbot.python.org/all/builders/x86 Windows7 3.x/builds/9421/
======================================================================
ERROR: test_read_dict_fieldnames_from_file (test.test_csv.TestDictFields)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "D:\cygwin\home\db3l\buildarea\3.x.bolen-windows7\build\lib\test\test_csv.py", line 629, in test_read_dict_fieldnames_from_file
    self.assertEqual(next(reader), {"f1": '1', "f2": '2', "f3": 'abc'})
  File "D:\cygwin\home\db3l\buildarea\3.x.bolen-windows7\build\lib\csv.py", line 110, in __next__
    row = next(self.reader)
  File "D:\cygwin\home\db3l\buildarea\3.x.bolen-windows7\build\lib\tempfile.py", line 431, in __iter__
    yield from iter(self.file)
ValueError: I/O operation on closed file.
Following patch fixes the issue, but I don't understand why.
Maybe we need to keep an explicit reference to self.file in the method (file = self.file) to keep it alive in the frame?
No, it doesn't help.
I think this is a consequence of PEP-380 and its decision to finalize the subgenerator when the delegating generator is closed. Consider this simple example without tempfile:
def yielder(fileobj):
    yield from fileobj

with open('some_test_file', 'w') as f:
    f.write('line one\nline two\nline three')

with open('some_test_file', 'r') as f:
    line = next(yielder(f))
    nline = next(f)

==>

Traceback (most recent call last):
  File "<pyshell#11>", line 3, in <module>
    nline = next(f)
ValueError: I/O operation on closed file.
I think test_csv does the file-closing operation on lines 626/627 when it creates the temporary csv.reader(fileobj).
def test_read_dict_fieldnames_from_file(self):
    with TemporaryFile("w+") as fileobj:
        fileobj.write("f1,f2,f3\r\n1,2,abc\r\n")
        fileobj.seek(0)
        reader = csv.DictReader(fileobj,
                                fieldnames=next(csv.reader(fileobj)))
        self.assertEqual(reader.fieldnames, ["f1", "f2", "f3"])
        self.assertEqual(next(reader), {"f1": '1', "f2": '2', "f3": 'abc'})
Actually, it's scary that use of yield from can have such a subtle side effect. Maybe PEP-380 should have taken this more seriously?
Ah yes, correct: when a generator using "yield from obj" is destroyed while yield from is not done, obj.close() is called if the method exists.
So "yield from file" *is* different from "for line in file: yield line" when we don't consume the whole generator.
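[Editorial note] The difference can be made visible with a small probe object (an illustrative sketch; Probe and delegate are hypothetical names, not library code):

```python
class Probe:
    """Iterator that records whether close() was called on it."""
    def __init__(self):
        self.closed = False

    def __iter__(self):
        return self

    def __next__(self):
        return 'line'

    def close(self):
        self.closed = True

def delegate(obj):
    yield from obj          # PEP 380 delegation

def loop(obj):
    for item in obj:        # plain for/yield: no delegation
        yield item

p1 = Probe()
g1 = delegate(p1)
next(g1)    # start the delegator so the 'yield from' is in progress
g1.close()  # PEP 380 finalization calls close() on the subiterator
assert p1.closed

p2 = Probe()
g2 = loop(p2)
next(g2)
g2.close()  # the for/yield version never touches p2.close()
assert not p2.closed
```

If the subiterator happens to be a file, that propagated close() really does close the file, which is exactly what the csv test tripped over.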
A workaround is to create a wrapper class and return it from the _TemporaryFileWrapper.__iter__() method:
class Iterator:
    def __init__(self, obj):
        self.obj = obj
    def __next__(self):
        if self.obj is None:
            raise StopIteration
        return next(self.obj)
    def __iter__(self):
        return self
    def close(self):
        self.obj = None
Or simply:

class Iterator:
    def __init__(self, obj):
        self.obj = obj
    def __next__(self):
        return next(self.obj)
    def __iter__(self):
        return self
This solution looks more complex than tempfile_iter_fix.patch.
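[Editorial note] The point of the workaround, as I read it, is that the wrapper exposes no close() that reaches the file (or, in the first variant, a close() that merely drops the reference), so PEP 380 finalization has nothing to propagate. A quick sketch of that effect, using the second variant (the delegate function is a hypothetical illustration):

```python
import io

class Iterator:
    """Forwarding iterator with no close() method."""
    def __init__(self, obj):
        self.obj = obj

    def __iter__(self):
        return self

    def __next__(self):
        return next(self.obj)

def delegate(obj):
    yield from obj

f = io.StringIO('a\nb\n')
g = delegate(Iterator(f))
next(g)    # iteration works normally through the wrapper
g.close()  # Iterator has no close(), so finalization leaves f alone
assert not f.closed
```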
@Serhiy: Maybe add a short comment to explain why yield from is not used and must not be used. (By the way, the current comment contains "yields from", which is confusing :-))
Ah yes, correct: when a generator using "yield from obj" is destroyed while yield from is not done, obj.close() is called if the method exists.
But why is obj.close() called? The reference to fileobj is live; it shouldn't be closed.
This solution looks more complex than tempfile_iter_fix.patch.
Why do you prefer the more complex solution to the simple one?
@Serhiy
in this line of code:
reader = csv.DictReader(fileobj, fieldnames=next(csv.reader(fileobj)))
csv.reader(fileobj) returns the generator created by fileobj.__iter__, but no reference to it is kept so the object gets destroyed right afterwards. This closes the generator and because it uses yield from also the contained subgenerator, which is the file itself.
@wolma any idea why this only happens on Windows? I can't reproduce the csv test failure on Linux.
@bkabrda not sure, but it may have to do with when exactly the object gets garbage collected
tempfile_iter_fix.patch looks good to me, can you commit it please?
csv.reader(fileobj) returns the generator created by fileobj.__iter__, but no reference to it is kept so the object gets destroyed right afterwards. This closes the generator and because it uses yield from also the contained subgenerator, which is the file itself.
Yes, there are no references to the generator created by fileobj.__iter__, but there are references to fileobj itself and to the file fileobj.file. I still don't understand why the file is closed. This looks like a bug.
Committed existing fix only to make buildbots green.
New changeset a90ec6b96af2 by Serhiy Storchaka in branch '3.4': Issue bpo-23700: NamedTemporaryFile iterator closed underlied file object in https://hg.python.org/cpython/rev/a90ec6b96af2
New changeset e639750ecd92 by Serhiy Storchaka in branch 'default': Issue bpo-23700: NamedTemporaryFile iterator closed underlied file object in https://hg.python.org/cpython/rev/e639750ecd92
so let's look at this step-by-step (and I hope I fully understood this myself):
calling fileobj.__iter__ creates a generator because the method uses yield from
that generator does not get assigned to any reference so it will be garbage-collected
when the generator is garbage-collected, the subgenerator specified to the right of the yield from is finalized (that is PEP-380-mandated behavior) and, in this case, that is iter(self.file)
for an io module-based file object, iter(f) is f, and finalizing it means that its close method will be called
So this is not about the file object getting garbage-collected, it is about it getting closed.
Since PEP-380 explicitly mentions this problem with yield from and a shared subiterator, I don't think you can call it a bug, but I think it is very problematic behavior, as illustrated by this issue, because client code would have to know whether a particular generator uses yield from or not.
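[Editorial note] The last two steps above can be checked directly with the io module (a quick sketch):

```python
import io

f = io.StringIO('line one\n')
assert iter(f) is f  # io file objects are their own iterators
# So when finalization calls close() on the "subgenerator", it is
# really the file's close(): two protocols sharing one method name.
f.close()
try:
    next(f)
except ValueError:
    print('iterating a closed file raises ValueError')
```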
Thank you for your explanation Wolfgang! Now it is clear to me. The issue is that the generator calls the close() method of the subgenerator, but if the subgenerator is a file, the close() method closes (surprise!) the file. Two different protocols use the same method.
Interesting: how many similar bugs were introduced by blindly replacing "for/yield" with "yield from"?
Isn't there some discussion somewhere that if iter(x) returns x you probably have buggy code? Maybe it is io that is broken, design-wise. I think there was another issue related to iter(file) recently... someone surprised by the fact that you can't iterate a file twice without reopening it... the source of that problem is similar. Not that it is necessarily soluble, but it is certainly *interesting* :)
Isn't there some discussion somewhere that if iter(x) returns x you probably have buggy code?
I agree that the issue comes from the fact that TextIOWrapper.__iter__(), BufferedReader.__iter__() and FileIO.__iter__() simply return self *and* those objects have a close method. The issue is not "yield from".
You are probably right that the io classes are broken.
From https://docs.python.org/3/library/stdtypes.html#iterator-types:
Once an iterator’s __next__() method raises StopIteration, it must continue to do so on subsequent calls. Implementations that do not obey this property are deemed broken.
One consequence of __iter__ returning self is that the above is not guaranteed:
>>> with open('somefile', 'w') as f:
...     f.write('some text')
...
9
>>> with open('somefile', 'r') as f:
...     i = iter(f)
...     assert f is i
...     for line in i:
...         print(line)
...     try:
...         next(i)
...     except StopIteration:
...         print('exhausted iterator')
...     f.seek(0)
...     print(next(i))
...
some text
exhausted iterator
0
some text
So the io classes are *officially* broken.
OTOH, it would be very hard to change the way file objects work, compared to how hard it would have been to design yield from differently, so I'd still blame the latter partly. Maybe it is unfortunate that generators have a close method instead of, say, __close__?
There's actually an existing issue about exactly that broken-per-docs problem in io. IMO, if the goal of calling close is to close only things that are generator objects or are pretending to be one, the method should have been named close_generator or something.
Comment in Lib/tempfile.py mentions issue bpo-23000, but should mention issue bpo-23700.
Indeed. And the whole comment could be better.
Does anyone want to write a better comment in light of the recent investigation?
How's this?
LGTM
New changeset e9f03315d66c by R David Murray in branch '3.4': bpo-23700: fix/improve comment https://hg.python.org/cpython/rev/e9f03315d66c
New changeset 64f4dbac9d07 by R David Murray in branch 'default': Merge: bpo-23700: fix/improve comment https://hg.python.org/cpython/rev/64f4dbac9d07
Yeah, the new comment is better :-) Thanks.
Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.
GitHub fields:
```python
assignee = 'https://github.com/serhiy-storchaka'
closed_at =
created_at =
labels = ['type-bug', 'library']
title = 'tempfile.NamedTemporaryFile can close too early if used as iterator'
updated_at =
user = 'https://bugs.python.org/bkabrda'
```
bugs.python.org fields:
```python
activity =
actor = 'vstinner'
assignee = 'serhiy.storchaka'
closed = True
closed_date =
closer = 'serhiy.storchaka'
components = ['Library (Lib)']
creation =
creator = 'bkabrda'
dependencies = []
files = ['38543', '38556', '38586', '38635']
hgrepos = []
issue_num = 23700
keywords = ['patch']
message_count = 35.0
messages = ['238451', '238505', '238506', '238510', '238511', '238512', '238513', '238518', '238575', '238606', '238611', '238613', '238614', '238615', '238617', '238619', '238621', '238622', '238623', '238664', '238677', '238678', '238683', '238691', '238697', '238698', '238699', '238700', '238757', '238895', '238911', '238917', '238918', '238919', '238929']
nosy_count = 12.0
nosy_names = ['georg.brandl', 'paul.moore', 'ncoghlan', 'vstinner', 'Arfrever', 'r.david.murray', 'ethan.furman', 'python-dev', 'serhiy.storchaka', 'bkabrda', 'sYnfo', 'wolma']
pr_nums = []
priority = 'normal'
resolution = 'fixed'
stage = 'resolved'
status = 'closed'
superseder = None
type = 'behavior'
url = 'https://bugs.python.org/issue23700'
versions = ['Python 3.4', 'Python 3.5']
```