Closed PMLS3 closed 8 months ago
Hey @PMLS3, thanks for asking. The OPENAI_API_KEY should be in your environment variables; you can run this command, or put it in your .bashrc file:
export OPENAI_API_KEY={{key}}
If you can't do this, you could also initialize the LLM like this:
let open_ai = OpenAI::new(options)
    .with_model(OpenAIModel::Gpt35)
    .with_api_key("your key here");
If you want to use a .env file, I think there are some crates you could use (e.g. dotenv or dotenvy) to load the key from the file into the environment, and then pass it to the with_api_key function.
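For illustration, here is a minimal sketch of what such .env crates do, using only the standard library (the `load_dotenv` helper below is hypothetical, not part of langchain-rust or any crate):

```rust
use std::{env, fs};

// Illustrative helper: roughly what .env crates like dotenv/dotenvy do,
// i.e. read KEY=value lines from a file and export them into the process env.
fn load_dotenv(path: &str) {
    if let Ok(contents) = fs::read_to_string(path) {
        for line in contents.lines() {
            let line = line.trim();
            // Skip blank lines and comments.
            if line.is_empty() || line.starts_with('#') {
                continue;
            }
            if let Some((key, value)) = line.split_once('=') {
                // `set_var` is safe on edition 2021; the unsafe block keeps
                // this compiling on edition 2024 as well.
                unsafe { env::set_var(key.trim(), value.trim()) };
            }
        }
    }
}

fn main() {
    load_dotenv(".env");
    // The value (if any) can now be read back and passed to with_api_key.
    let key = env::var("OPENAI_API_KEY").unwrap_or_default();
    println!("OPENAI_API_KEY length: {}", key.len());
}
```

This is only a sketch; in practice the crates handle quoting, escaping, and more edge cases.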
you are a legend! Thanks
Thanks 😄 @PMLS3. If you want, you could add it to the README file as a prerequisite and create a PR.
export OPENAI_API_KEY={{key}}
or use with_api_key:
let open_ai = OpenAI::new(options)
    .with_model(OpenAIModel::Gpt35) // You can change the model as needed
    .with_api_key("");
Permission to Abraxas-365/langchain-rust.git denied to PMLS3
@PMLS3 That's weird. Are you trying to push directly? You should create a Pull Request. Please let me know.
Hi, I have tried initializing the model with the code you mentioned above, and it gives the following error:
error[E0423]: expected value, found module `options`
How can I solve this?
Hey @DravidVaishnav, that was the old way of doing it.
You could check the examples in the README.
Here is one of them:
https://github.com/Abraxas-365/langchain-rust/blob/main/examples/llm_openai.rs
Hey @Abraxas-365, thanks for replying. I have tried the example you mentioned, and it gives the error:
You didn't provide an API key
I have already set OPENAI_API_KEY in my environment variables, but it still fails. Is there any way I can solve this, or provide the key in the code itself?
Hey @DravidVaishnav, I think you are using a .env file, right? The right way to set env variables is to put them in your .bashrc or .zshrc, or to type this in the terminal you are using:
export OPENAI_API_KEY=yourkey
If you want to set it in code, you can use this:
let openai = OpenAI::default()
    .with_config(
        OpenAIConfig::default()
            .with_api_key("<your_key>"),
    );
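If you want both options at once, a small sketch (standard library only) is to prefer the exported variable and fall back to a hard-coded placeholder, then hand the resulting string to with_api_key:

```rust
use std::env;

fn main() {
    // Prefer the exported OPENAI_API_KEY; fall back to a placeholder so the
    // builder call (.with_api_key(&key)) always receives a value.
    let key = env::var("OPENAI_API_KEY")
        .unwrap_or_else(|_| "<your_key>".to_string());
    println!("key length: {}", key.len());
}
```

This keeps the key out of source control while still letting local experiments hard-code it.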
Thank you for the exciting project!
I wanted to test the agent and got this error:
thread 'main' panicked at src/examples/agent.rs:45:19: Error invoking LLMChain: ApiError(ApiError { message: "You didn't provide an API key. You need to provide your API key in an Authorization header using Bearer auth (i.e. Authorization: Bearer YOUR_KEY), or as the password field (with blank username) if you're accessing the API from your browser and are prompted for a username and password. You can obtain an API key from https://platform.openai.com/account/api-keys.", type: Some("invalid_request_error"), param: None, code: None })
I have my key in a .env file as OPENAI_API_KEY.
Is there another way I should have loaded it?
Thank you in advance