svilupp / PromptingTools.jl

Streamline your life using PromptingTools.jl, the Julia package that simplifies interacting with large language models.
https://svilupp.github.io/PromptingTools.jl/dev/
MIT License

Allow changing OpenAI key after first load #161

Closed PGimenez closed 1 month ago

PGimenez commented 1 month ago

If I understand correctly, right now the OpenAI key is set on package load and read from the env variable, as these lines show:

https://github.com/svilupp/PromptingTools.jl/blob/29d64d61db55856bd41bcb8cc4a73bc20a36c9e7/src/user_preferences.jl#L137-L139
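
Roughly, something like this (paraphrasing from memory, not the exact source, which I believe also goes through Preferences.jl):

# Rough paraphrase (my assumption, not the linked code verbatim):
# the key is resolved once, at package load, from the environment variable
const OPENAI_API_KEY = get(ENV, "OPENAI_API_KEY", "")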

In some cases one might need to switch between keys, for example when building a web UI on top of PT.jl where multiple concurrent users can enter their own keys.

Is there any way to override the global config key or pass it as a parameter when executing a query?

svilupp commented 1 month ago

Good question. I should address it in the FAQ.

So the current hierarchy is:

1. the api_key keyword argument you pass directly to the call (highest priority),
2. the global variable PT.OPENAI_API_KEY,
3. the Preferences.jl setting / OPENAI_API_KEY environment variable, which is read once at package load.

You can mutate the key at any one of these levels. If you’re using a shared server with multiple users, I’d use a user-specific variable and provide it as the api_key kwarg.
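
For the multi-user case, a minimal sketch (get_user_api_key and current_user are just placeholders for your own lookup, they are not part of PT.jl):

using PromptingTools
const PT = PromptingTools

user_key = get_user_api_key(current_user)   # hypothetical per-user lookup in your app

# The api_key kwarg overrides the global key for this one call only
msg = PT.aigenerate("Say hi!"; api_key = user_key)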

Does that answer your question?

PGimenez commented 1 month ago

I think I can't do option 1, as I'm using the aihelp call in AIHelpMe.jl, but it's good to know you can override the key directly in the function call.

I did try option 2, but it seems the key change is not picked up. This is what I tried in the REPL:

julia> AIHelpMe.PT.OPENAI_API_KEY
""

julia> AIHelpMe.PT.OPENAI_API_KEY = "sk-HY----"
WARNING: redefinition of constant PromptingTools.OPENAI_API_KEY. This may fail, cause incorrect answers, or produce other errors.
"sk-HY-----"

julia> AIHelpMe.PT.OPENAI_API_KEY
"sk-HY-----"

julia> aihelp("what is this package about?")
ERROR: ArgumentError: api_key cannot be empty
Stacktrace:
  [1] auth_header(::OpenAI.OpenAIProvider, api_key::String)
    @ OpenAI ~/.julia/packages/OpenAI/d65zV/src/OpenAI.jl:40
  [2] _request(api::String, provider::OpenAI.OpenAIProvider, api_key::String; method::String, query::Nothing, http_kwargs::@NamedTuple{…}, streamcallback::Nothing, additional_headers::Vector{…}, kwargs::@Kwargs{…})
    @ OpenAI ~/.julia/packages/OpenAI/d65zV/src/OpenAI.jl:159
  [3] _request

. . .

julia> aihelp"how to implement quicksort in Julia?"
ERROR: ArgumentError: api_key cannot be empty
Stacktrace:
  [1] auth_header(::OpenAI.OpenAIProvider, api_key::String)
    @ OpenAI ~/.julia/packages/OpenAI/d65zV/src/OpenAI.jl:40
  [2] _request(api::String, provider::OpenAI.OpenAIProvider, api_key::String; method::String, query::Nothing, http_kwargs::@NamedTuple{…}, streamcallback::Nothing, additional_headers::Vector{…}, kwargs::@Kwargs{…})
    @ OpenAI ~/.julia/packages/OpenAI/d65zV/src/OpenAI.jl:159
  [3] _request

Edit: On the other hand, PT.ai works regardless of what I set the key to, as it seems it was already loaded from the env var:

 julia> AIHelpMe.PT.OPENAI_API_KEY = ""
WARNING: redefinition of constant PromptingTools.OPENAI_API_KEY. This may fail, cause incorrect answers, or produce other errors.
""

julia> AIHelpMe.PT.ai"And what is the population of it?"
[ Info: Tokens: 51 @ Cost: $0.0001 in 1.1 seconds
PromptingTools.AIMessage("I'm sorry, could you please provide me with more context or specify which city, country, or region you are referring to?")

svilupp commented 1 month ago

You can do 1). The pipeline is fully generic to support any customization: you can provide nested kwargs to reach the corresponding RAG steps, like this: https://github.com/svilupp/PromptingTools.jl/blob/29d64d61db55856bd41bcb8cc4a73bc20a36c9e7/src/Experimental/RAGTools/generation.jl#L612

The goal is to give the “api_key” to all steps that call the LLM (which steps those are depends on your configuration, the curse of generality 😥) - for the bronze pipeline it would be the embedder and answerer steps. You might prefer to use a utility like this: https://github.com/svilupp/PromptingTools.jl/blob/29d64d61db55856bd41bcb8cc4a73bc20a36c9e7/src/Experimental/RAGTools/utils.jl#L355
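
For instance, a rough sketch along those lines (assuming the default bronze pipeline, that the nested groups are named :embedder_kwargs and :answerer_kwargs, and that user_key is your own per-user variable):

using AIHelpMe
const RT = AIHelpMe.PT.Experimental.RAGTools

# Push the same per-user key into both nested kwarg groups (group names assumed above)
kwargs = AIHelpMe.RAG_KWARGS[]
kwargs = RT.setpropertynested(kwargs, [:embedder_kwargs], :api_key, user_key)
kwargs = RT.setpropertynested(kwargs, [:answerer_kwargs], :api_key, user_key)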

The alternative for more advanced/custom systems is to roll their own “aihelp” wrapper where they expose that api_key natively.

Re 2) It must be that Julia remembers the default keyword now for that signature if you don’t provide your api key. That’s probably why ai”” worked - it hasn’t been called before.
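
A tiny illustration of what I mean (my guess at the mechanism): code compiled against a const can keep using the old value after the constant is redefined.

const KEY = ""
get_key() = KEY
get_key()          # first call compiles against KEY == "" and returns ""

KEY = "sk-new"     # WARNING: redefinition of constant KEY
get_key()          # may still return "" - the old value can stay baked in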

PGimenez commented 1 month ago

Alright so I think I got it, thanks! This is what I'll do:

using AIHelpMe
question = "who are you?"
# Inject the per-user key into the nested embedder kwargs, then call the RAG pipeline directly
kwargs = AIHelpMe.PT.Experimental.RAGTools.setpropertynested(
    AIHelpMe.RAG_KWARGS[], [:embedder_kwargs], :api_key, "sk-HY")
AIHelpMe.PT.Experimental.RAGTools.airag(AIHelpMe.RAG_CONFIG[], AIHelpMe.MAIN_INDEX[];
    question, return_all = true, kwargs...)