MhhhxX opened 1 year ago
I could fix the problem by downgrading the numpy package, but then I ran into another problem with the protobuf package:
Traceback (most recent call last):
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/home/user/lit/lit_nlp/examples/toxicity_demo.py", line 19, in <module>
from lit_nlp.examples.datasets import classification
File "/home/user/lit/lit_nlp/examples/datasets/classification.py", line 8, in <module>
import tensorflow_datasets as tfds
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/__init__.py", line 43, in <module>
import tensorflow_datasets.core.logging as _tfds_logging
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/__init__.py", line 22, in <module>
from tensorflow_datasets.core import community
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/community/__init__.py", line 18, in <module>
from tensorflow_datasets.core.community.huggingface_wrapper import mock_builtin_to_use_gfile
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/community/huggingface_wrapper.py", line 31, in <module>
from tensorflow_datasets.core import dataset_builder
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/dataset_builder.py", line 34, in <module>
from tensorflow_datasets.core import dataset_info
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/dataset_info.py", line 50, in <module>
from tensorflow_datasets.core import splits as splits_lib
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/splits.py", line 34, in <module>
from tensorflow_datasets.core import proto as proto_lib
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/proto/__init__.py", line 18, in <module>
from tensorflow_datasets.core.proto import dataset_info_generated_pb2 as dataset_info_pb2 # pylint: disable=line-too-long
File "/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/tensorflow_datasets/core/proto/dataset_info_generated_pb2.py", line 22, in <module>
from google.protobuf.internal import builder as _builder
ImportError: cannot import name 'builder' from 'google.protobuf.internal' (/home/user/.conda/envs/lit-nlp/lib/python3.9/site-packages/google/protobuf/internal/__init__.py)
Downgrading and upgrading protobuf didn't help, as it broke other dependencies.
I could also solve this second problem by applying the steps suggested in that Stack Overflow post.
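For anyone hitting the same ImportError: the `google.protobuf.internal.builder` module only exists in protobuf 3.20 and later, and it's what `*_pb2.py` files generated by newer protoc releases import. A quick way to check whether your environment's protobuf release ships it is to probe for the module without importing it (a minimal sketch; the `has_module` helper name is just illustrative):

```python
import importlib.util


def has_module(name: str) -> bool:
    """Return True if `name` is importable, without actually importing it."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. google.protobuf) is absent.
        return False


# The 'builder' helper only exists in protobuf >= 3.20; if this prints
# False, the installed protobuf predates the generated *_pb2.py files.
print("builder available:", has_module("google.protobuf.internal.builder"))
```

If the check prints `False`, the environment has a protobuf older than 3.20 (or none at all), which matches the traceback above.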
@MhhhxX Are you still running into this issue after our (somewhat) recent release? That release did a lot to pin down Python version numbers in dependencies to address breakages.
I installed lit as described in the section "Install from source". Every command succeeded, including building the frontend with yarn. Then I tried to run a demo from the examples module within the conda environment with this command:

python -m lit_nlp.examples.penguin_demo --port=4321 --quickstart

I received the following error:

Do you have any ideas how to fix that?