sainisanjay closed this issue 11 months ago
Yeah, of course, you can run the Streamlit app locally. You might need a .env file with the required variables. I'll upload an example .env file and update this thread. You'll also need a Cohere/OpenAI API key and a Pinecone DB API key.
Thanks for your quick response! Please upload the .env file. One more question: if we switch to an open-source model like LLaMA-2, what changes are required?
Just added the example.env file in commit 36c0dc6.
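For reference, a minimal .env might look like the sketch below. Only `OPENAI_API_KEY` is confirmed by the app's error message; the other variable names are assumptions, so match them against example.env in the repo:

```ini
# Read by ChatOpenAI via load_dotenv() (name confirmed by the app's error message)
OPENAI_API_KEY=sk-...

# Assumed names for the other services mentioned above; check example.env
COHERE_API_KEY=...
PINECONE_API_KEY=...
PINECONE_ENVIRONMENT=...
```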
You need to change the classes used for retrieving the answers, e.g. by using a HF Endpoint or a SageMaker Endpoint. I'll take a look and tell you which lines you need to change.
@axiom-of-choice Thank you so much for your quick support! I will try to do the same as per your suggestions. In the meantime, if you get time to write the open-source model integration code, please let me know.
@axiom-of-choice I added the OpenAI key to the .env file, but the app still gets the error below:
~/workspace/LLM-Chatbot$ streamlit run streamlit_app.py
You can now view your Streamlit app in your browser.
Local URL: http://localhost:8501
Network URL: http://192.168.237.129:8501
/home/saini/workspace/LLM-Chatbot
2023-10-25 10:04:17.268 Uncaught app exception
Traceback (most recent call last):
File "/home/saini/anaconda3/envs/ml/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 541, in _run_script
exec(code, module.__dict__)
File "/home/saini/workspace/LLM-Chatbot/streamlit_app.py", line 15, in <module>
from config import *
File "/home/saini/workspace/LLM-Chatbot/config/__init__.py", line 2, in <module>
from .config import *
File "/home/saini/workspace/LLM-Chatbot/config/config.py", line 61, in <module>
"GPT 3.5 turbo": ChatOpenAI(openai_api_key=OPENAI_API_KEY, model_name="gpt-3.5-turbo", temperature=0.0),
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ChatOpenAI
__root__
Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)
Added OpenAI Key:
Please check that your file is named .env and not example.env (by default the load_dotenv() function reads the .env file).
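That is also why the error mentions an environment variable: this is roughly what `load_dotenv()` does under the hood (a simplified stdlib-only sketch, not python-dotenv's actual implementation). It parses `KEY=VALUE` lines from a file literally named `.env` into `os.environ`, so an `example.env` is silently ignored and `ChatOpenAI` never sees the key:

```python
import os

def load_dotenv_sketch(path=".env"):
    """Simplified stand-in for python-dotenv's load_dotenv():
    parse KEY=VALUE lines from `path` into os.environ.
    Returns False when the file does not exist (e.g. example.env)."""
    if not os.path.exists(path):
        return False
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blank lines and comments
            key, _, value = line.partition("=")
            # like the real load_dotenv, do not override variables already set
            os.environ.setdefault(key.strip(), value.strip().strip('"'))
    return True

# Demo: write a .env file and load it
with open(".env", "w") as f:
    f.write("OPENAI_API_KEY=sk-test\n")
os.environ.pop("OPENAI_API_KEY", None)  # ensure a clean slate for the demo
load_dotenv_sketch()
print(os.environ["OPENAI_API_KEY"])  # → sk-test
```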
Thanks @axiom-of-choice, I am able to run the app now. Please let me know how to add open-source models like LLaMA-2 instead of ChatGPT.
Yeah, could you please open another issue in this repo so I can follow up there without mixing questions? I will write some examples using HF endpoints and upload them, maybe tomorrow.
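In the meantime, the general shape of the change in config/config.py is just swapping the value in the model dict. Below is a sketch of that pattern; `LlamaStub` is a hypothetical placeholder, and in practice you would use something like LangChain's `HuggingFaceHub` wrapper pointed at a Llama-2 repo (shown in the comments), which requires langchain installed and a Hugging Face token:

```python
import os

# Hypothetical stand-in for a LangChain LLM wrapper. The real code would
# look something like (assumption, verify against your langchain version):
#   from langchain.llms import HuggingFaceHub
#   llama = HuggingFaceHub(
#       repo_id="meta-llama/Llama-2-7b-chat-hf",
#       huggingfacehub_api_token=os.getenv("HUGGINGFACEHUB_API_TOKEN"),
#   )
class LlamaStub:
    def __init__(self, repo_id, api_token=None):
        self.repo_id = repo_id
        self.api_token = api_token

    def predict(self, prompt):
        # A real wrapper would call the hosted model here
        return f"[{self.repo_id}] would answer: {prompt}"

# Mirrors the model dict in config/config.py (the one holding
# "GPT 3.5 turbo": ChatOpenAI(...)) — the swap is just replacing or
# adding an entry, then selecting it in the Streamlit UI.
MODELS = {
    "LLaMA-2 7B chat": LlamaStub(
        repo_id="meta-llama/Llama-2-7b-chat-hf",
        api_token=os.getenv("HUGGINGFACEHUB_API_TOKEN"),
    ),
}

print(MODELS["LLaMA-2 7B chat"].predict("hello"))
```

The key point is that the rest of the app only touches the dict entry, so swapping providers is localized to that one line plus the retrieval classes mentioned above.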