youngchanpark opened this issue 1 year ago
I think you might have tried this: https://huggingface.co/spaces/bhaskartripathi/pdfChatter?docker=true Looking at the error, I feel you can try the following:
- `Your_Key_Here="YOUR_VALUE_HERE"` (make sure this key has not expired)
- When you save the image, use this instruction: `docker save --output=C:\YOUR_PATH\my_docker_image.tar aa9e20aea25a` (the image ID)
- When you load the image, try this: `docker load --input C:\YOUR_PATH\my_docker_image.tar`
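The save/load steps above can also be driven from Python — a hedged sketch, not pdfGPT code: the image ID and tar filename are the placeholders from this thread, and the `subprocess.run` calls are commented out so the snippet only builds the commands.

```python
import subprocess  # needed once the run() calls below are uncommented

def docker_save_cmd(image_id: str, tar_path: str) -> list[str]:
    # Equivalent to: docker save --output=<tar_path> <image_id>
    return ["docker", "save", f"--output={tar_path}", image_id]

def docker_load_cmd(tar_path: str) -> list[str]:
    # Equivalent to: docker load --input <tar_path>
    return ["docker", "load", "--input", tar_path]

save_cmd = docker_save_cmd("aa9e20aea25a", "my_docker_image.tar")
load_cmd = docker_load_cmd("my_docker_image.tar")
# subprocess.run(save_cmd, check=True)  # run on the source machine
# subprocess.run(load_cmd, check=True)  # run on the target machine
print(" ".join(save_cmd))
print(" ".join(load_cmd))
```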
I am also getting the same error and am unable to run it locally. I have also tried installing all of the required libraries manually and executing `app.py` and `api.py` by hand (`python3 app.py` and `lc-serve deploy local api`), but the API does not seem to work properly. In particular, the endpoints seem not to be registered: opening the Swagger docs of the API only shows the default API routes, not the `ask_url` or `ask_file` endpoints. Therefore, when asking a question through the app, only a "detail: Not Found" error is returned from the endpoint.
@deepankarm Please help.
Hey @timothydillan, can you please share the following information to help in debugging?

- OS
- Python version (`python --version`)
- langchain-serve version (`lc-serve -v`)
- which directory are you executing the `lc-serve deploy ...` command from? And what's the content inside that directory? `ls -l` should help.
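For convenience, the requested details can be gathered in one go with a short script. This is just a sketch: it assumes `lc-serve` is on your `PATH`, so that line is left commented.

```python
import platform
import subprocess
import sys

# OS and Python version, as requested above.
print("OS:", platform.platform())
print("Python:", sys.version.split()[0])

# langchain-serve version -- uncomment on a machine with lc-serve installed.
# print("lc-serve:", subprocess.run(["lc-serve", "-v"],
#                                   capture_output=True, text=True).stdout.strip())
```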
Hey @deepankarm and @bhaskartripathi,
I am on the latest version of macOS (Ventura 13.3.1). My Python version is 3.8.8, my langchain-serve version is 0.0.22, and I am running `lc-serve deploy` in the project's (pdfGPT) directory. For now, as a temporary workaround, I was able to use the app properly by changing the langchain-serve implementation of the REST API to use FastAPI instead.
Also getting this issue. I'm trying to set it up through Docker and it keeps getting stuck on the same part. Running Pop!_OS on a ThinkPad X395.
Hi all,
Instead of running pdfGPT from the container, I did the following to just run it on my host:

- installed the dependencies in `requirements.txt`
- downloaded the Universal Sentence Encoder locally and replaced the code in the `api.py` file as instructed in the README
- hardcoded my API key in the `load_openai_key` function (couldn't be bothered to export my API key every time)
- ran `lc-serve deploy local api` in one terminal
- ran `python3 app.py` in another terminal

Hope this helps
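The `load_openai_key` tweak described above might look roughly like this — a hypothetical sketch, not pdfGPT's actual function, and the hardcoded fallback string is a placeholder:

```python
import os

def load_openai_key() -> str:
    """Return the OpenAI API key, preferring the environment variable.

    Hardcoding a fallback (as in the workaround above) avoids having to
    export the key in every new shell. Never commit a real key.
    """
    key = os.environ.get("OPENAI_API_KEY", "").strip()
    if not key:
        key = "sk-YOUR-KEY-HERE"  # hypothetical hardcoded fallback
    return key

print(load_openai_key())
```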
@youngchanpark any idea about this warning when I run `python3 app.py`?
2023-07-03 18:57:49.477996: I tensorflow/tsl/cuda/cudart_stub.cc:28] Could not find cuda drivers on your machine, GPU will not be used.
2023-07-03 18:57:49.863083: I tensorflow/tsl/cuda/cudart_stub.cc:28] Could not find cuda drivers on your machine, GPU will not be used.
2023-07-03 18:57:49.864075: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-07-03 18:57:50.859284: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
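For what it's worth, those lines are informational rather than errors: TensorFlow found no CUDA driver (and no TensorRT) and falls back to CPU, which is fine for running the Universal Sentence Encoder. If the noise bothers you, TensorFlow's standard `TF_CPP_MIN_LOG_LEVEL` environment variable can be set before the import — a sketch, with the `tensorflow` import commented out so it also runs where TF isn't installed:

```python
import os

# "0" = all logs, "1" = hide INFO, "2" = hide INFO+WARNING, "3" = hide errors too.
# Must be set BEFORE `import tensorflow` to take effect.
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"

# import tensorflow as tf
# print(tf.config.list_physical_devices("GPU"))  # [] on a CPU-only machine
print(os.environ["TF_CPP_MIN_LOG_LEVEL"])
```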
Hi, I ran the `docker pull` command as suggested in the README but I get the following output. Is there maybe something wrong with the `aa9e20aea25a` layer?