vlasovskikh / funcparserlib

Recursive descent parsing library for Python based on functional combinators
https://funcparserlib.pirx.ru
MIT License
338 stars 38 forks

make_tokenizer tokenisers return generators, Parser expects sequences #44

Closed: gsnedders closed this issue 1 year ago

gsnedders commented 8 years ago
from funcparserlib.lexer import make_tokenizer
from funcparserlib.parser import some

tokenize = make_tokenizer([
    (u'x', (r'x',)),
])

some(lambda t: t.type == "x").parse(tokenize("x"))

results in

Traceback (most recent call last):
  File "/Users/gsnedders/Documents/other-projects/funcparserlib/funcparserlib/funcparserlib/tests/test_parsing.py", line 76, in test_tokenize
    some(lambda t: t.type == "x").parse(tokenize("x"))
  File "/Users/gsnedders/Documents/other-projects/funcparserlib/funcparserlib/funcparserlib/parser.py", line 121, in parse
    (tree, _) = self.run(tokens, State())
  File "/Users/gsnedders/Documents/other-projects/funcparserlib/funcparserlib/funcparserlib/parser.py", line 309, in _some
    if s.pos >= len(tokens):
TypeError: object of type 'generator' has no len()

tokenize("x") is a generator, and you can't call len on a generator.
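The root cause can be reproduced without funcparserlib at all: Python generators do not support `len()`, while materialising them into a list does. A minimal sketch (the `tokenize` function here is a hypothetical stand-in for `make_tokenizer`'s lazy output, not the library's implementation):

```python
def tokenize(s):
    # Stand-in for make_tokenizer's return value: a lazy generator of tokens.
    for ch in s:
        yield ch

tokens = tokenize("x")

# Generators have no length, so len() raises TypeError -- the same error
# that Parser.run's bounds check (s.pos >= len(tokens)) triggers.
try:
    len(tokens)
except TypeError as e:
    print(e)  # object of type 'generator' has no len()

# Materialising the generator into a list restores len() and indexing.
tokens = list(tokenize("x"))
print(len(tokens))  # 1
```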

jtprobst commented 7 years ago

I'm facing the same issue. Are there any updates?

jtprobst commented 7 years ago

I have temporarily worked around it by calling parse(list(tokenize(s))) instead of parse(tokenize(s)).

gsnedders commented 1 year ago

Per @vlasovskikh in https://github.com/vlasovskikh/funcparserlib/pull/45#issuecomment-821788078, this is intended behaviour, which means you can't pass the return value of make_tokenizer to parse directly and instead need to wrap it in list().