jupyter-incubator / sparkmagic

Jupyter magics and kernels for working with remote Spark clusters

[BUG] TypeError: required field "type_ignores" missing from Module #748

Closed BertrandBrelier closed 1 year ago

BertrandBrelier commented 2 years ago

Describe the bug
Hello everybody, I would appreciate your help with the following issue:

I get the following error message when running more than one line of code in a Jupyter cell on a remote Spark cluster:

    An error was encountered:
    required field "type_ignores" missing from Module
    Traceback (most recent call last):
      File "/home/user/tmpdata/nm-local-dir/usercache/user/appcache/application_1644356194603_0003/container_1644356194603_0003_01_000001/tmp/3472037301374113689", line 223, in execute
        code = compile(mod, '', 'exec')
    TypeError: required field "type_ignores" missing from Module

To Reproduce
Run the following code in one Jupyter cell:

    print("5")
    print("6")

or

    import numpy as np
    np.mean([1, 2])

The code runs fine when each line is executed in a separate Jupyter cell.

Expected behavior
The entire cell, with multiple lines of code, should run.

Screenshots
SparkMagicError (screenshot attached)

Versions:


Thank you for your help,

BertrandBrelier commented 2 years ago

This is actually a Livy issue, caused by `ast.Module` requiring an additional `type_ignores` argument as of Python 3.8: https://issues.apache.org/jira/browse/LIVY-795
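The cause can be reproduced outside Livy. This is a minimal sketch (not taken from fake_shell.py): on Python 3.8+, constructing an `ast.Module` by hand without the `type_ignores` field makes `compile()` raise the same `TypeError` shown in the traceback above.

```python
import ast

# Grab a single statement node, as fake_shell.py does per line of a cell.
node = ast.parse('print("5")').body[0]

# Build the Module the pre-3.8 way, omitting type_ignores.
mod = ast.Module([node])

try:
    compile(mod, '<string>', 'exec')
except TypeError as e:
    # On Python 3.8+ this raises:
    # required field "type_ignores" missing from Module
    print(e)
```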

Solution: update the fake_shell.py file from

for node in to_run_exec:
    mod = ast.Module([node])

To:

    for node in to_run_exec:
        if sys.version_info >= (3, 8):
            mod = ast.Module([node], type_ignores=[])
        else:
            mod = ast.Module([node])
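For reference, the same version guard can be wrapped in a small standalone helper. This is a sketch under my own naming (`compile_node` is not part of fake_shell.py or Livy), showing that the patched construction compiles and executes correctly on both sides of the 3.8 boundary:

```python
import ast
import sys

def compile_node(node):
    """Wrap one statement node in a Module and compile it,
    handling the ast.Module signature change in Python 3.8+.
    (Hypothetical helper, mirroring the fake_shell.py patch.)"""
    if sys.version_info >= (3, 8):
        mod = ast.Module([node], type_ignores=[])
    else:
        mod = ast.Module([node])
    return compile(mod, '<string>', 'exec')

# Compile and run each top-level statement of a multi-line cell,
# node by node, the way fake_shell.py executes cell code.
tree = ast.parse('print("5")\nprint("6")')
for node in tree.body:
    exec(compile_node(node))
```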

Updating the fake_shell.py file and recompiling Livy fixed the issue.

Bertrand