Open · filemon11 opened this issue 2 years ago
Hi @filemon11, this looks like Spark is not properly set up on Windows. Can you make sure you followed all the steps involved in installing spark-nlp and pyspark on Windows?
The Windows setup can be a bit tricky, but if you are just getting started we recommend using Google Colab, which instantly provides you with a working environment in your browser:
https://colab.research.google.com/drive/1j4Ek0JkBPmnK75qIxyYjVtYWNUPRbh9v?usp=sharing
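For reference, a typical Windows setup looks roughly like the sketch below. The versions and paths here are assumptions for illustration (chosen to match the PySpark 3.1.2 / Java 8 configuration reported later in this thread) and must be adapted to your own machine; the Colab notebook above sidesteps all of this:

```shell
REM Sketch of a Windows Spark NLP setup; versions/paths are examples, not prescriptions.

REM 1. Install PySpark and the NLU / Spark NLP packages into your virtualenv:
pip install pyspark==3.1.2 spark-nlp nlu

REM 2. Point Spark at a Java 8 (or 11) JDK:
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_311

REM 3. On Windows, Spark additionally needs Hadoop's winutils.exe.
REM    Place it under %HADOOP_HOME%\bin and export:
set HADOOP_HOME=C:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin
```

A missing or mismatched winutils.exe / HADOOP_HOME is one of the most common causes of Spark failing silently on Windows, which then surfaces in NLU as "Could not create NLU component" errors like the one below.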
@C-K-Loan could you please share some guidance on the Windows installation?
I get the following error with this configuration:

- OS: Windows 10
- Java version: 1.8.0_311 (Java 8)
- PySpark version: 3.1.2
```
Traceback (most recent call last):
  File "D:\.venv\python3.8_nlu\lib\site-packages\nlu\__init__.py", line 236, in load
    nlu_component = nlu_ref_to_component(nlu_ref, authenticated=is_authenticated)
  File "D:\.venv\python3.8_nlu\lib\site-packages\nlu\pipe\component_resolution.py", line 171, in nlu_ref_to_component
    resolved_component = resolve_component_from_parsed_query_data(language, component_type, dataset,
  File "D:\.venv\python3.8_nlu\lib\site-packages\nlu\pipe\component_resolution.py", line 320, in resolve_component_from_parsed_query_data
    raise ValueError(f'EXCEPTION : Could not create NLU component for nlp_ref={nlp_ref} and nlu_ref={nlu_ref}')
ValueError: EXCEPTION : Could not create NLU component for nlp_ref=elmo and nlu_ref=elmo

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "", line 1, in
  File "D:\.venv\python3.8_nlu\lib\site-packages\nlu\__init__.py", line 255, in load
    raise Exception(
Exception: Something went wrong during loading and fitting the pipe. Check the other prints for more information and also verbose mode. Did you use a correct model reference?
```
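Since the diagnosis earlier in the thread points at the Windows Spark setup rather than the model reference itself, here is a small self-check sketch for the usual prerequisites. `JAVA_HOME` and `HADOOP_HOME` are the standard Spark/Hadoop environment variables; the helper function name is made up for this example:

```python
import os
from pathlib import Path


def check_spark_windows_env():
    """Return a list of likely Spark-on-Windows setup problems (hypothetical helper)."""
    problems = []

    # Spark needs a JVM; JAVA_HOME should point at a Java 8 or 11 JDK.
    java_home = os.environ.get('JAVA_HOME')
    if not java_home:
        problems.append('JAVA_HOME is not set')
    elif not Path(java_home).exists():
        problems.append('JAVA_HOME points at a non-existent directory')

    # On Windows, Spark additionally needs Hadoop's winutils.exe under %HADOOP_HOME%\bin.
    hadoop_home = os.environ.get('HADOOP_HOME')
    if not hadoop_home:
        problems.append('HADOOP_HOME is not set (winutils.exe is required on Windows)')
    elif not (Path(hadoop_home) / 'bin' / 'winutils.exe').exists():
        problems.append('winutils.exe not found under HADOOP_HOME\\bin')

    return problems


print(check_spark_windows_env())
```

If this prints an empty list but `nlu.load('elmo')` still fails, running the load with `nlu.load('elmo', verbose=True)` (verbose mode, as the exception message suggests) should surface the underlying Spark error.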