run-llama / rags

Build ChatGPT over your data, all with natural language

Removing Dependence on secrets.toml for environment variables #54

Open nickknyc opened 8 months ago

nickknyc commented 8 months ago

Per https://github.com/run-llama/rags/issues/51 I have removed the use of st.secrets and thus the need for using a secrets.toml file.

Why: secrets.toml is problematic when deploying to hosts other than Streamlit/Snowflake. In my case, deploying to Azure App Service left secrets.toml exposed, which caused OpenAI to automatically revoke the exposed key, which in turn made my rags deployment fail at runtime.

What: This PR removes all uses of st.secrets and the secrets.toml file, and instead reads configuration from the environment with plain os.getenv().
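For readers unfamiliar with the pattern, here is a minimal sketch of the kind of swap the PR describes. The variable name `OPENAI_API_KEY` and the error handling are illustrative assumptions, not necessarily what the actual diff contains:

```python
import os
import streamlit as st

# Before (depends on .streamlit/secrets.toml being present and readable):
# openai_api_key = st.secrets["OPENAI_API_KEY"]

# After: read the key from the process environment, which works on any host
# (e.g. set as an application setting on Azure App Service).
openai_api_key = os.getenv("OPENAI_API_KEY")
if openai_api_key is None:
    st.error("OPENAI_API_KEY is not set in the environment.")
    st.stop()
```

The upside of `os.getenv()` is that the secret never has to live in a file inside the deployed app bundle; the trade-off is that local development now requires exporting the variable (or using a tool like python-dotenv) instead of relying on Streamlit's secrets file.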