ufal / udpipe

UDPipe: Trainable pipeline for tokenizing, tagging, lemmatizing and parsing Universal Treebanks and other CoNLL-U files
Mozilla Public License 2.0

Unable to install UDPipe 2 as a local REST server #184

Closed Shasetty closed 8 months ago

Shasetty commented 9 months ago

> The attached error message is not complete, so it is hard to tell, but it seems your environment has some package version mismatch. I would remove the Python virtual environment and reinstall it.

Installed Python version: 3.10.12


Other installed packages are:

- tensorflow_gpu~=2.10.0 (also tried versions 2.10.1 and 2.12.0)
- ufal.chu_liu_edmonds
- ufal.udpipe>=1.3,<2
- transformers 4.36.2

I was able to run ./start_wembeddings_server.py, but even after that the error is still there:

```
2024-01-07 16:53:18.770740: I external/local_tsl/tsl/cuda/cudart_stub.cc:31] Could not find cuda drivers on your machine, GPU will not be used.
2024-01-07 16:53:18.990023: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2024-01-07 16:53:18.990103: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2024-01-07 16:53:19.039740: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2024-01-07 16:53:19.142172: I external/local_tsl/tsl/cuda/cudart_stub.cc:31] Could not find cuda drivers on your machine, GPU will not be used.
2024-01-07 16:53:19.143352: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2024-01-07 16:53:20.150933: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Traceback (most recent call last):
  File "/home/sachi/sumanth.k/udpipe-2.1.0/udpipe2_server.py", line 29, in <module>
    import wembedding_service.wembeddings.wembeddings as wembeddings
ModuleNotFoundError: No module named 'wembedding_service.wembeddings'
```

foxik commented 9 months ago

Hi,

the wembedding_service/wembeddings/wembeddings.py file is present in the UDPipe 2 repository; however, it is a Git submodule, so maybe you did not check it out. In other words, is the wembedding_service directory empty or non-empty in your clone? If it is empty, you need to run, for example, `git submodule update --init`.
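The check described above can also be scripted before starting the server. The helper below is just an illustrative sketch (the directory name comes from this thread; the helper itself is not part of UDPipe): an empty directory means the submodule has not been checked out yet.

```python
import os

def submodule_checked_out(path="wembedding_service"):
    """Return True if `path` exists and contains files, i.e. the Git
    submodule has been checked out. An empty directory means you still
    need to run `git submodule update --init`."""
    return os.path.isdir(path) and bool(os.listdir(path))

if __name__ == "__main__":
    if not submodule_checked_out():
        print("wembedding_service is empty - run: git submodule update --init")
```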

Cheers!

Shasetty commented 8 months ago

I have downloaded the model and was able to make the server run on port 8001 using the command:

```shell
python3 udpipe2_server.py 8001 --logfile udpipe2_server.log --threads=4 \
  english english-ewt-ud-2.10-220711:en_ewt-ud-2.10-220711:eng:en \
  models-2.10/en_all-ud-2.10-220711.model en_ewt \
  https://ufal.mff.cuni.cz/udpipe/2/models#universal_dependencies_210_models
```

When I hit localhost:8001 from a browser, I get: `No handler for the given URL '/'`
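That message is expected: the server does not serve `/` at all, only specific REST endpoints. Assuming the local server mirrors the public UDPipe REST API, where GET `/models` returns JSON with a `"models"` mapping and a `"default_model"` field, a minimal sketch to query it could look like this (the endpoint name and reply shape are assumptions based on the public API, not taken from this thread):

```python
import json
import urllib.request

# Assumption: the local server mirrors the public UDPipe REST API.
MODELS_ENDPOINT = "/models"

def parse_models_reply(payload):
    """Extract (sorted model names, default model) from a /models JSON reply."""
    reply = json.loads(payload)
    return sorted(reply["models"]), reply.get("default_model")

def list_models(service="http://localhost:8001"):
    """Query a running udpipe2_server for its available models."""
    with urllib.request.urlopen(service.rstrip("/") + MODELS_ENDPOINT) as response:
        return parse_models_reply(response.read().decode("utf-8"))
```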

I tried running udpipe2_client.py using the command `python3 udpipe2_client.py`, but got no output.

How do I set up both the server and the client locally?
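For completeness, the REST API can also be called without udpipe2_client.py. The sketch below assumes the server follows the public UDPipe REST API: a POST to `/process` with form fields `data`, `model`, `tokenizer`, `tagger`, and `parser` (empty values meaning "use defaults"), returning JSON whose `"result"` field holds the CoNLL-U output. Those field names are assumptions based on that public API:

```python
import json
import urllib.parse
import urllib.request

def build_process_request(text, service="http://localhost:8001", model="english"):
    """Build (url, form-encoded body) for a UDPipe 2 /process call.
    Empty tokenizer/tagger/parser values mean "use the defaults"."""
    url = service.rstrip("/") + "/process"
    body = urllib.parse.urlencode({
        "data": text, "model": model,
        "tokenizer": "", "tagger": "", "parser": "",
    }).encode("utf-8")
    return url, body

def udpipe_process(text, **kwargs):
    """POST the request and return the CoNLL-U annotation from the JSON reply."""
    url, body = build_process_request(text, **kwargs)
    with urllib.request.urlopen(url, body) as response:
        return json.loads(response.read().decode("utf-8"))["result"]
```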

The installed Python packages are:

```
Package                         Version
absl-py                         2.0.0
astunparse                      1.6.3
cachetools                      5.3.2
certifi                         2023.11.17
charset-normalizer              3.3.2
filelock                        3.13.1
flatbuffers                     23.5.26
fsspec                          2023.12.2
gast                            0.4.0
google-auth                     2.26.1
google-auth-oauthlib            0.4.6
google-pasta                    0.2.0
grpcio                          1.60.0
h5py                            3.10.0
huggingface-hub                 0.20.2
idna                            3.6
keras                           2.10.0
Keras-Preprocessing             1.1.2
libclang                        16.0.6
Markdown                        3.5.1
MarkupSafe                      2.1.3
numpy                           1.26.3
oauthlib                        3.2.2
opt-einsum                      3.3.0
packaging                       23.2
pip                             22.0.2
protobuf                        3.19.6
pyasn1                          0.5.1
pyasn1-modules                  0.3.0
PyYAML                          6.0.1
regex                           2023.12.25
requests                        2.31.0
requests-oauthlib               1.3.1
rsa                             4.9
safetensors                     0.4.1
setuptools                      59.6.0
six                             1.16.0
tensorboard                     2.10.1
tensorboard-data-server        0.6.1
tensorboard-plugin-wit          1.8.1
tensorflow-estimator            2.10.0
tensorflow-gpu                  2.10.1
tensorflow-io-gcs-filesystem    0.35.0
termcolor                       2.4.0
tokenizers                      0.15.0
tqdm                            4.66.1
transformers                    4.36.2
typing_extensions               4.9.0
ufal.chu-liu-edmonds            1.0.2
ufal.udpipe                     1.3.1.1
urllib3                         2.1.0
Werkzeug                        3.0.1
wheel                           0.42.0
wrapt                           1.16.0
```

foxik commented 8 months ago

When your server is running, you can try running

Cheers!

Shasetty commented 8 months ago

I installed the server and it is working fine.

Thank you

sumanthk07 commented 6 months ago

I have set up the server using the command:

```shell
python3 udpipe2_server.py 8001 --logfile udpipe2_server.log --threads=4 \
  english english-ewt-ud-2.10-220711:en_ewt-ud-2.10-220711:eng:en \
  models-2.10/en_all-ud-2.10-220711.model en_ewt \
  https://ufal.mff.cuni.cz/udpipe/2/models#universal_dependencies_210_models
```

Also, when I run `python3 udpipe2_client.py --service=http://localhost:8001/ --list_models`, I get the following output, as expected:

```
english-ewt-ud-2.10-220711
Default model: english
```

However, when I try to run the client with `python3 udpipe2_client.py --service=http://localhost:8001/ --model=english --tokenizer= --tagger= --parser= input_file.txt`, I encounter the following error:

```
An exception was raised during UDPipe 'process' REST request.
The service returned the following error:
An internal error occurred during processing.
Traceback (most recent call last):
  File "/home/sumanth.k/udpipe/udpipe2_client.py", line 122, in <module>
    (outfile or sys.stdout).write(process(args, data))
  File "/home/sumanth.k/udpipe/udpipe2_client.py", line 77, in process
    response = perform_request(args.service, "process", data)
  File "/home/sumanth.k/udpipe/udpipe2_client.py", line 42, in perform_request
    with urllib.request.urlopen(urllib.request.Request(
  File "/usr/lib/python3.10/urllib/request.py", line 216, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.10/urllib/request.py", line 525, in open
    response = meth(req, response)
  File "/usr/lib/python3.10/urllib/request.py", line 634, in http_response
    response = self.parent.error(
  File "/usr/lib/python3.10/urllib/request.py", line 563, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.10/urllib/request.py", line 496, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.10/urllib/request.py", line 643, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 400: Bad Request
```
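One reason the traceback above is uninformative is that urllib's HTTPError hides the response body, which is where a server usually describes what actually went wrong. A small sketch (not part of udpipe2_client.py) that surfaces it when debugging requests like this:

```python
import urllib.error
import urllib.request

def fetch_with_error_body(url, data=None):
    """Open a URL, but on an HTTP error re-raise with the response body
    included, since that is where the server describes what went wrong."""
    try:
        with urllib.request.urlopen(url, data) as response:
            return response.read().decode("utf-8")
    except urllib.error.HTTPError as err:
        detail = err.read().decode("utf-8", errors="replace")
        raise RuntimeError(f"HTTP {err.code} from {url}: {detail}") from err
```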

The installed packages are:

```
Package                         Version
absl-py                         2.1.0
astunparse                      1.6.3
cachetools                      5.3.3
certifi                         2024.2.2
charset-normalizer              3.3.2
flatbuffers                     24.3.7
gast                            0.4.0
google-auth                     2.29.0
google-auth-oauthlib            0.4.6
google-pasta                    0.2.0
grpcio                          1.62.1
h5py                            3.10.0
idna                            3.6
keras                           2.10.0
Keras-Preprocessing             1.1.2
libclang                        18.1.1
Markdown                        3.6
MarkupSafe                      2.1.5
numpy                           1.26.4
oauthlib                        3.2.2
opt-einsum                      3.3.0
packaging                       24.0
pip                             22.0.2
protobuf                        3.19.6
pyasn1                          0.5.1
pyasn1-modules                  0.3.0
requests                        2.31.0
requests-oauthlib               2.0.0
rsa                             4.9
setuptools                      59.6.0
six                             1.16.0
tensorboard                     2.10.1
tensorboard-data-server        0.6.1
tensorboard-plugin-wit          1.8.1
tensorflow-estimator            2.10.0
tensorflow-gpu                  2.10.1
tensorflow-io-gcs-filesystem    0.36.0
termcolor                       2.4.0
typing_extensions               4.10.0
ufal.chu_liu_edmonds            1.0.3
ufal.udpipe                     1.3.1.1
urllib3                         2.2.1
Werkzeug                        3.0.1
wheel                           0.43.0
wrapt                           1.16.0
```

Could you please assist me in setting up the server and using it locally?

foxik commented 6 months ago

The stack trace is from udpipe2_client.py, which is however not very informative.

What does the udpipe2_server.log look like after the request?

sumanthk07 commented 6 months ago

> The stack trace is from udpipe2_client.py, which is however not very informative.
>
> What does the udpipe2_server.log look like after the request?

I checked udpipe2_server.log and saw that a module was missing; after installing it, everything works as expected.

Thank you.