Error when calling list(), bytes(), tuple(), bytearray() and some other types on tokenize.tokenize() and tokenize.generate_tokens() return object #110425
When calling list(), bytes(), tuple(), bytearray() and some other types (I didn't test them all) on tokenize.tokenize() and tokenize.generate_tokens() return objects, I get an error.
Example code:
import tokenize
list(tokenize.tokenize("print(1)"))
Gives an error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\surfer\AppData\Local\Programs\Python\Python312\Lib\tokenize.py", line 440, in tokenize
    encoding, consumed = detect_encoding(readline)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\surfer\AppData\Local\Programs\Python\Python312\Lib\tokenize.py", line 381, in detect_encoding
    first = read_or_stop()
            ^^^^^^^^^^^^^^
  File "C:\Users\surfer\AppData\Local\Programs\Python\Python312\Lib\tokenize.py", line 339, in read_or_stop
    return readline()
           ^^^^^^^^^^
TypeError: 'str' object is not callable
So how can I convert the returned object into a list of lists/tuples/etc. containing the token name, value, and all other information, like this:
[
[<token 1 data>],
[<token 2 data>],
...
]
This seems to happen with every built-in type constructor that accepts a generator object as an argument.
You are using tokenize.tokenize() incorrectly. The argument it takes is not a string but a callable that provides the same interface as the readline method of a binary file object (https://docs.python.org/3/library/tokenize.html). The error occurs because the string you passed is being called as if it were that readline function.
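A minimal sketch of the intended usage, wrapping the source string in an in-memory buffer so its readline method can be passed in (the source string "print(1)" is taken from the example above):

```python
import io
import tokenize

source = "print(1)"

# tokenize.tokenize() expects a readline callable that yields bytes,
# e.g. the readline method of a BytesIO wrapping the encoded source.
tokens = list(tokenize.tokenize(io.BytesIO(source.encode("utf-8")).readline))

# tokenize.generate_tokens() is the str-based equivalent: it takes a
# readline callable that yields str, e.g. from a StringIO.
str_tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

# Each element is a TokenInfo named tuple (type, string, start, end, line),
# so a plain list-of-lists can be built from it directly.
as_lists = [list(tok) for tok in tokens]
```

Since TokenInfo is a named tuple, list(tok) or tuple(tok) on each element yields the flat per-token data the question asks for.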
CPython versions tested on:
3.12
Operating systems tested on:
Windows