python / cpython

The Python programming language
https://www.python.org

weak dict iterators are fragile because of unpredictable GC runs #51354

Closed pitrou closed 10 years ago

pitrou commented 15 years ago
BPO 7105
Nosy @mdickinson, @pitrou, @kristjanvalur, @devdanzin, @jparise, @benjaminp, @asqui
PRs
  • python/cpython#20687

Files
  • weakiter3.patch
  • issue7105.patch
  • weakref.patch

Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.


GitHub fields:

```python
assignee = None
closed_at =
created_at =
labels = ['type-bug', 'library']
title = 'weak dict iterators are fragile because of unpredictable GC runs'
updated_at =
user = 'https://github.com/pitrou'
```

bugs.python.org fields:

```python
activity =
actor = 'gvanrossum'
assignee = 'none'
closed = True
closed_date =
closer = 'kristjan.jonsson'
components = ['Library (Lib)']
creation =
creator = 'pitrou'
dependencies = []
files = ['15114', '32689', '32932']
hgrepos = []
issue_num = 7105
keywords = ['patch']
message_count = 30.0
messages = ['93863', '93865', '93912', '93934', '97392', '97425', '111259', '142392', '203250', '203286', '203318', '203345', '203349', '203354', '203355', '203356', '204975', '204980', '205013', '205014', '205019', '205090', '205092', '205093', '205124', '205215', '205224', '205285', '217444', '266818']
nosy_count = 14.0
nosy_names = ['dcjim', 'tseaver', 'mark.dickinson', 'pitrou', 'kristjan.jonsson', 'ajaksu2', 'jon', 'benjamin.peterson', 'vdupras', 'elachuni', 'neologix', 'python-dev', 'qelan', 'dfortunov']
pr_nums = ['20687']
priority = 'normal'
resolution = 'fixed'
stage = None
status = 'closed'
superseder = None
type = 'behavior'
url = 'https://bugs.python.org/issue7105'
versions = ['Python 2.6', 'Python 2.7']
```

    pitrou commented 15 years ago

    As mentioned in bpo-7060, weak dict iterators are easily broken by cyclic garbage collection changing the size of the underlying dict storage:

```
File "/home/rdmurray/python/py3k/Lib/weakref.py", line 121, in items
    for wr in self.data.values():
RuntimeError: dictionary changed size during iteration
```
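The underlying mechanism is easy to demonstrate with a plain dict (no weakrefs or GC involved): shrinking a dict while iterating raises the same RuntimeError that GC-driven weakref removals trigger inside WeakValueDictionary iteration. A minimal sketch:

```python
# Plain-dict illustration: mutating a dict mid-iteration raises
# "RuntimeError: dictionary changed size during iteration".
d = {1: "a", 2: "b", 3: "c"}
err = None
try:
    for key in d:
        del d[key]  # shrink the dict while iterating over it
except RuntimeError as exc:
    err = str(exc)
print(err)
```

In the weak dict case the deletion is performed by a weakref callback fired during a cyclic GC run, so the program never mutates the dict explicitly, which is what makes the failure unpredictable.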

    One possible solution is to delay all removals until all iterators are done. Here is a context manager-based solution.
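A heavily simplified sketch of the delayed-removal idea follows. TinyWeakDict is a hypothetical stand-in for WeakValueDictionary (the helper in CPython's Lib/weakref.py is called _IterationGuard; everything else here is illustrative, not the actual patch):

```python
class _IterationGuard:
    # Context manager that registers itself with the weak container;
    # while any guard is live, the container defers removals.
    def __init__(self, container):
        self.container = container

    def __enter__(self):
        self.container._iterating.add(self)
        return self

    def __exit__(self, *args):
        c = self.container
        c._iterating.discard(self)
        if not c._iterating:
            c._commit_removals()


class TinyWeakDict:
    # Hypothetical, minimal stand-in for WeakValueDictionary.
    def __init__(self):
        self.data = {}
        self._pending_removals = []
        self._iterating = set()

    def _remove(self, key):
        # In the real class this runs from a weakref callback:
        # defer the removal if any iterator is active.
        if self._iterating:
            self._pending_removals.append(key)
        else:
            del self.data[key]

    def _commit_removals(self):
        while self._pending_removals:
            self.data.pop(self._pending_removals.pop(), None)

    def items(self):
        # The guard is held for the whole life of the generator.
        with _IterationGuard(self):
            for key, value in self.data.items():
                yield key, value


# Demo: a removal requested mid-iteration is deferred until the loop ends.
d = TinyWeakDict()
d.data.update({1: "a", 2: "b"})
it = d.items()
first = next(it)
d._remove(1)
deferred = 1 in d.data   # still present while iterating
rest = list(it)          # exhausting the iterator commits removals
after = 1 in d.data
```

The key property is that the underlying dict never changes size while an iterator is walking it, so the RuntimeError cannot occur.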

    gvanrossum commented 15 years ago

    delay all removals until all iterators are done

    +1

    pitrou commented 15 years ago

    This new patch makes it possible to mutate the dict without messing with the delayed removal when an iterator exists.

    pitrou commented 15 years ago

It turns out that weak sets have the same problem; here is a new patch fixing them as well.

    benjaminp commented 14 years ago

    LGTM

    pitrou commented 14 years ago

    Committed in r77365 (py3k) and r77366 (3.1). Thank you.

    mdickinson commented 14 years ago

    The issue still exists in 2.6 and 2.7.

    Closing bpo-839159 as a duplicate of this one.

    3461ee34-d1fa-4dd1-9a3b-c01fef01762b commented 13 years ago

    Any plans on actually patching this in 2.7 any time soon? This is affecting our software and hanging it on random occasions.

    vstinner commented 11 years ago

    What is the status of this issue? Does anyone still want to backport the fix to Python 2.7?

(I found this issue while searching for test_multiprocessing failures, and I found bpo-7060, which refers to this issue.)

    84401114-8e59-4056-83cb-632106c0b648 commented 11 years ago

    Attached is a version for 2.7

    Btw, I think a cleaner way to deal with unpredictable GC runs is to be able to temporarily disable GC with a context manager.

    pitrou commented 11 years ago

    Btw, I think a cleaner way to deal with unpredictable GC runs is to be able to temporarily disable GC with a context manager

    Disabling the GC in a library function sounds very ugly to me.

    84401114-8e59-4056-83cb-632106c0b648 commented 11 years ago

    No matter how it sounds, it certainly looks cleaner in code. Look at all this code, designed to work around an unexpected GC collection with various pointy bits and edge cases and special corners.

Compare to explicitly just asking the GC to relent, for a bit:

```python
def getitems(self):
    with gc.disabled():
        for each in self.data.items():
            yield each
```

    That's it.

While a native implementation of such a context manager would be better (faster, and could be made overriding), a simple one can be constructed thus:

```python
import contextlib
import gc

@contextlib.contextmanager
def gc_disabled():
    enabled = gc.isenabled()
    gc.disable()
    try:
        yield
    finally:
        if enabled:
            gc.enable()
```

Such global "atomic" context managers are well known to Stackless programmers. It's a very common idiom when building higher-level primitives (such as locks) from lower-level ones:

```python
with stackless.atomic():
    do()
    various()
    stuff_that_does_not_like_being_interrupted()
```

(stackless.atomic prevents involuntary tasklet switching _and_ involuntary thread switching.)

    pitrou commented 11 years ago

    No matter how it sounds, it certainly looks cleaner in code.

    It's also unsafe and invasive, since it's a process-wide setting. An iterator can be long-lived if it's being consumed slowly, so you've disabled garbage collection for an unknown amount of time, without the user knowing about it. Another thread could kick in and perhaps re-enable it for whatever reason.

Oh, and if someone calls gc.collect() explicitly, the solution is suddenly defeated.

    84401114-8e59-4056-83cb-632106c0b648 commented 11 years ago

    Yes, the "long iterator" scenario is the reason it is not ideal for this scenario. The other one (gc.collect()) is easily solved by implementing this construct natively. It can be done rather simply by adding an overriding "pause" property to gc, with the following api:

```python
def pause(increment):
    """
    Pause or unpause garbage collection. A positive value
    increases the pause level, while a negative one reduces it.
    When paused, GC won't happen even when explicitly requested with
    gc.collect(), until the pause level drops to 0.
    """
```

I'm sure there are other places in the code that would benefit from not having an accidental GC run happen. I'm sure I've seen such places, with elaborate scaffolding to safeguard themselves from such cases.

    Anyway, my 2 aurar worth of lateral thinking applied to the problem at hand :)

    What about the patch itself?

    pitrou commented 11 years ago

    Speaking of which, this is not only about GC runs, although it is the most annoying scenario (since you basically don't control when it happens). Regular resource reaping because of reference counting can also wreak havoc.
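The refcounting point is easy to demonstrate on CPython, where dropping the last strong reference reclaims the object immediately (no cyclic GC run needed), firing the weakref callback and shrinking the weak dict on the spot. A minimal sketch (Obj and the variable names are illustrative):

```python
import weakref

class Obj:
    pass

strong = {i: Obj() for i in range(3)}
wd = weakref.WeakValueDictionary(strong)

# On CPython, this deletion reclaims the object via reference counting,
# which fires the weakref callback and removes the entry from wd
# immediately -- no garbage collector involvement at all.
del strong[0]
remaining = sorted(wd.keys())
print(remaining)
```

If such a deletion happens inside the body of a loop over the weak dict, the iterator is invalidated just as surely as by a GC run.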

    About the patch, I don't know. It introduces new complexity in 2.7 which should be fairly stable by now. The one thing I don't like is your replacement of "iterator" by "iterable" in the docs.

    (you also have one line commented out in the tests)

    84401114-8e59-4056-83cb-632106c0b648 commented 11 years ago

    The changes for the docs are just a port of the original patch. And indeed, these functions don't return an iterator, but a generator object.

    I admit I'm confused by the difference, since next() can be called directly on the generator. Still, a lot of code in the test explicitly calls iter() on the iterators before doing next(). Not sure why.

The commented-out line is an artifact; I'll remove it. The correct one is the test for 20 items.

    Otherwise, I have no vested interest in getting this in. My porting this is just me contributing to 2.7. If it's vetoed, we'll just put it in 2.8.

    84401114-8e59-4056-83cb-632106c0b648 commented 10 years ago

    Here's a different approach. Simply avoid the use of iterators over the underlying container. Instead, we iterate over lists of items/keys/values etc.

    gvanrossum commented 10 years ago

    I appreciate the simplicity, but I don't think it is acceptable -- if the dict is large, materializing the entire list of keys/values/items might allocate a prohibitive amount of memory. It's worse if you have code that is expected to break out of the loop.

    84401114-8e59-4056-83cb-632106c0b648 commented 10 years ago

    Yes, the old memory argument. But is it valid? Is there a conceivable application where a dict of weak references would be storing a large chunk of the application memory? Remember, all of the data must be referred to from elsewhere, or else, the weak refs would not exist. An extra list of pointers is unlikely to make a difference. I think the chief reason to use iterators has to do with performance by avoiding the creation of temporary objects, not saving memory per-se.

    Before the invention of "iteritems()" and friends, all such iteration was by lists (and hence, memory usage). We should try to remain nimble enough so that we can undo an optimization previously done, if the requirements merit us doing so.

As a completely unrelated example of such nimbleness: faced with stricter regulations in the 70s, American car makers had to sell their muscle cars with increasingly less powerful engines, effectively rolling back previous optimizations :)

Anyway, it's not for me to decide. We currently have three options:

a) my first patch, which is a duplication of the 3.x work but is non-trivial and could bring stability issues;
b) my second patch, which will increase memory use, but to no more than previous versions of Python used while iterating;
c) do nothing, and have iterations over weak dicts randomly break when an underlying cycle is unraveled during iteration.

    Cheers!

    pitrou commented 10 years ago

Anyway, it's not for me to decide. We currently have three options:

a) my first patch, which is a duplication of the 3.x work but is non-trivial and could bring stability issues;
b) my second patch, which will increase memory use, but to no more than previous versions of Python used while iterating;
c) do nothing, and have iterations over weak dicts randomly break when an underlying cycle is unraveled during iteration.

    Either a) or c), for me. We shouldn't change semantics in bugfix releases.

    gvanrossum commented 10 years ago

    I'm with Antoine. Have we heard of any problems with the 3.x version of the patch? How different is it?

    84401114-8e59-4056-83cb-632106c0b648 commented 10 years ago

Strictly speaking, b) is not a semantic change. Depending on your semantic definition of semantics. At any rate, it is even less of one than a), since the temporary list is hidden from view and the only side effect is additional memory usage.

    84401114-8e59-4056-83cb-632106c0b648 commented 10 years ago

d) We could also simply issue a (documentation) warning that the "iterator" methods of these dictionaries are known to be fragile, and recommend that people use keys(), values() and items() instead.

    pitrou commented 10 years ago

    d) sounds like a good enough resolution at this point.

    gvanrossum commented 10 years ago

    I'm not sure I understand the hesitation about backporting the Python 3 solution. We're acknowledging it's a bug, so the fix is not a feature. The Python 3 solution is the future. So why not fix it?

    84401114-8e59-4056-83cb-632106c0b648 commented 10 years ago

    That's the spirit, Guido :) I just think people are being extra careful after the "regression" introduced in 2.7.5. However, IMHO we must never let the odd mistake scare us from making necessary moves. Unless Antoine explicitly objects, I think I'll submit my patch from november and we'll just watch what happens.

    pitrou commented 10 years ago

    I'm ok with the backport.

    1762cc99-3127-4a62-9baf-30c3d0f51ef7 commented 10 years ago

    New changeset 03fcc12282fc by Kristján Valur Jónsson in branch '2.7': Issue bpo-7105: weak dict iterators are fragile because of unpredictable GC runs http://hg.python.org/cpython/rev/03fcc12282fc

    pitrou commented 10 years ago

    Since this is backported, shouldn't it be closed?

    79528080-9d85-4d18-8a2a-8b1f07640dd7 commented 8 years ago

    Shouldn't the documentation be updated?

    https://docs.python.org/3.6/library/weakref.html#weakref.WeakKeyDictionary

Note: Because a WeakKeyDictionary is built on top of a Python dictionary, it must not change size when iterating over it. This can be difficult to ensure for a WeakKeyDictionary because actions performed by the program during iteration may cause items in the dictionary to vanish “by magic” (as a side effect of garbage collection).