[Open] gstenzel opened this issue 3 years ago
You have an error in your environment:
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
Please follow these instructions step by step, making sure you have Apache Spark/Hadoop set up correctly on Windows. (Every step matters, and unfortunately it's a bit long for Windows.)
https://github.com/JohnSnowLabs/spark-nlp/discussions/1022
Do this before running pip install nlu pyspark==2.4.7
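The linked guide essentially boils down to pointing Hadoop at a local winutils.exe: the `null\bin\winutils.exe` in the error means HADOOP_HOME was never set. A minimal sketch of that step (the `C:\hadoop` path is hypothetical; use wherever you actually placed winutils.exe):

```python
import os

# Hypothetical install location; the guide linked above covers downloading
# winutils.exe into %HADOOP_HOME%\bin.
hadoop_home = r"C:\hadoop"

# Spark/Hadoop resolves %HADOOP_HOME%\bin\winutils.exe on Windows, so both
# variables must be set before pyspark/nlu are imported.
os.environ["HADOOP_HOME"] = hadoop_home
os.environ["PATH"] = (
    os.environ.get("PATH", "") + os.pathsep + os.path.join(hadoop_home, "bin")
)

# Only import nlu / pyspark AFTER these variables are set:
# import nlu
```

Setting the variables system-wide (System Properties → Environment Variables) works just as well and survives restarts.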
openjdk version "1.8.0_282"
(equivalent to JDK 8)

>>> import nlu

runs without errors. The same errors occur when running:

nlu.load('tokenize').predict('Each word and symbol in a sentence will generate token.')
Full stack trace: