Set iteration order is non-deterministic across Python processes (string hashes are randomized per run), so this serializes the generated _lextokens in a random order. Adding a sorted(...) call here fixes that.
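A minimal sketch of the kind of change, for illustration only: the helper name write_lextab and the exact format string below are assumptions, not ply's actual table-writing code, but the before/after shows the sorted(...) idea.

    # Illustrative sketch; names and format string are assumptions, not ply's real API.
    def write_lextab(lextokens, path):
        """Serialize the token set into a generated lextab module."""
        with open(path, 'w') as f:
            # Before: iterating the set directly makes the output depend on
            # per-process hash randomization, so repeated builds can differ:
            #   f.write('_lextokens = set(%r)\n' % (tuple(lextokens),))
            # After: sorting first makes the generated file byte-for-byte stable.
            f.write('_lextokens = set(%r)\n' % (tuple(sorted(lextokens)),))

    # Example: produces the same line on every run, regardless of hash seed.
    write_lextab({'ID', 'LPAREN', 'RPAREN'}, 'lextab_fragment.py')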
Keeping the generated code deterministic has concrete benefits: hermetic build systems like Bazel hash their inputs and outputs, so stable output lets them tell when code depending on pycparser does not need to be rebuilt, and yields better cache hits.
Note that _lextokens does not exist in upstream ply, so I patched it here directly. If/when pycparser uprevs to a newer ply, this patch won't matter anymore.