tanchongmin / strictjson

A Strict JSON Framework for LLM Outputs
MIT License

AZURE_OPENAI_API_KEY seems not supported, only OPENAI_API_KEY in source #12

Open twilight2001 opened 1 month ago

twilight2001 commented 1 month ago

Hi John and team, thanks for creating the package. I can see its advantages, but since we're in an unsupported country we must use Azure OpenAI, and I cannot get the LLM function working due to the following error. The source code seems to support only OPENAI_API_KEY; if there's another workaround, kindly let me know :) THANKS!

ERROR

OpenAIError                               Traceback (most recent call last)
Cell In[31], line 1
----> 1 res = strict_json(system_prompt = 'You are a classifier',
      2                   user_prompt = 'It is a beautiful and sunny day',
      3                   output_format = {'Sentiment': 'Type of Sentiment',
      4                                    'Adjectives': 'List of adjectives',
      5                                    'Words': 'Number of words'})
      7 print(res)
....

File ~\anaconda311\envs\python-31013\lib\site-packages\openai\_client.py:98, in OpenAI.__init__(self, api_key, organization, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
     96     api_key = os.environ.get("OPENAI_API_KEY")
     97 if api_key is None:
---> 98     raise OpenAIError(
     99         "The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"
    100     )
    101 self.api_key = api_key
    103 if organization is None:

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

LLM FUNCTION CALL

def custom_llm(system_prompt: str, user_prompt: str):
    # ensure your LLM imports are all within this function
    # from openai import OpenAI
    from openai import AzureOpenAI  # Import AzureOpenAI for Azure-specific configurations
    import os
    import configparser
    import re
    import json

    # Read the configuration file
    config = configparser.ConfigParser()
    config.read('config_upgrade_auseast.ini')

    # Read values from the config file
    azure_openai_key = config.get('openai', 'api_key')
    azure_openai_endpoint = config.get('openai', 'endpoint')

    os.environ["AZURE_OPENAI_API_KEY"] = azure_openai_key
    os.environ["AZURE_OPENAI_ENDPOINT"] = azure_openai_endpoint

    # Create the Azure OpenAI client
    client = AzureOpenAI(
        api_key=azure_openai_key,
        api_version="2024-02-15-preview",
        azure_endpoint=azure_openai_endpoint
    )

    # Make a chat completion request (for Azure, model is the deployment name)
    response = client.chat.completions.create(
        model="molly-dev-gpt",
        temperature=0,
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt}
        ]
    )
    return response.choices[0].message.content
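For reference, the `config_upgrade_auseast.ini` file read above follows the standard configparser format with an `[openai]` section containing `api_key` and `endpoint` keys. A minimal, offline-testable sketch (the key and endpoint values here are placeholders, not real credentials):

```python
import configparser
import os
import tempfile

# Write a minimal config matching the [openai] section custom_llm reads;
# placeholder values only, not real credentials.
cfg_text = (
    "[openai]\n"
    "api_key = placeholder-key\n"
    "endpoint = https://example-resource.openai.azure.com/\n"
)
path = os.path.join(tempfile.mkdtemp(), "config_upgrade_auseast.ini")
with open(path, "w") as f:
    f.write(cfg_text)

# Read it back the same way the function above does
config = configparser.ConfigParser()
config.read(path)
azure_openai_key = config.get("openai", "api_key")
azure_openai_endpoint = config.get("openai", "endpoint")
```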
twilight2001 commented 1 month ago

I got it working by adding this chunk of code in base.py and making a custom_llm block. It works, though it's not pretty, because Azure does not have a model parameter, only a deployment name.

elif host == 'azure':
    # Get the API key and endpoint from environment variables
    azure_openai_key = os.environ.get("AZURE_OPENAI_API_KEY")
    azure_openai_endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT")

    if not azure_openai_key or not azure_openai_endpoint:
        raise ValueError("AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT environment variables must be set.")

    # Create the AzureOpenAI client instance
    client = AzureOpenAI(
        api_key=azure_openai_key,
        api_version="2024-02-15-preview",
        azure_endpoint=azure_openai_endpoint,
        azure_deployment=model   # the client takes azure_deployment (not deployment_name); set to your deployment name
    )
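The branch above can be exercised offline by isolating its environment-variable logic. In this sketch, `make_client` and `_StubAzureClient` are hypothetical stand-ins I've named for illustration; the stub replaces the real `openai.AzureOpenAI` client so no network or credentials are needed:

```python
import os

class _StubAzureClient:
    """Hypothetical stand-in for openai.AzureOpenAI, so the branch can run offline."""
    def __init__(self, api_key, api_version, azure_endpoint):
        self.api_key = api_key
        self.api_version = api_version
        self.azure_endpoint = azure_endpoint

def make_client(host: str):
    """Mirrors the elif host == 'azure' branch added to base.py."""
    if host == "azure":
        azure_openai_key = os.environ.get("AZURE_OPENAI_API_KEY")
        azure_openai_endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT")
        if not azure_openai_key or not azure_openai_endpoint:
            raise ValueError(
                "AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT environment variables must be set."
            )
        return _StubAzureClient(
            api_key=azure_openai_key,
            api_version="2024-02-15-preview",
            azure_endpoint=azure_openai_endpoint,
        )
    raise ValueError(f"Unsupported host: {host}")
```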
tanchongmin commented 1 month ago

Does passing your custom_llm directly into strict_json() via llm = custom_llm solve this OpenAI key issue?

Example code (Refer to Tutorial 0)

from openai import AzureOpenAI

### Put in your Azure OpenAI keys here ###
AZURE_OPENAI_KEY = ""
AZURE_OPENAI_ENDPOINT = ""

def azureOpenAiChat(
    system_prompt: str, user_prompt: str, model: str = "GPT3_5Turbo", temperature: float = 0, **kwargs
) -> str:

    azure_open_ai_client = AzureOpenAI(
        api_key = AZURE_OPENAI_KEY,
        azure_endpoint = AZURE_OPENAI_ENDPOINT,
        api_version = "2024-02-15-preview"
    )
    response = azure_open_ai_client.chat.completions.create(
        model=model,
        messages=[{"role": "system", "content": system_prompt}, {"role": "user", "content": user_prompt}],
        temperature=temperature,
        **kwargs,
    )

    return response.choices[0].message.content

def azure_chatgpt(system_prompt: str, user_prompt: str) -> str:
    return azureOpenAiChat(system_prompt=system_prompt, user_prompt=user_prompt, model="GPT3_5Turbo", temperature=0)

Run the Custom LLM

res = strict_json(system_prompt = 'You are a classifier',
                    user_prompt = 'It is a beautiful and sunny day',
                    output_format = {'Sentiment': 'Type of Sentiment',
                                    'Adjectives': 'Array of adjectives',
                                    'Words': 'Number of words'},
                                     llm = azure_chatgpt) # set this to your own LLM
print(res)
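For anyone testing this pattern without Azure credentials, the contract is simply that any callable taking (system_prompt, user_prompt) and returning a string can be passed as llm. A sketch with a hypothetical stub (mock_llm and its fixed JSON payload are illustrative, not part of the library):

```python
import json

def mock_llm(system_prompt: str, user_prompt: str) -> str:
    """Hypothetical stand-in for azure_chatgpt: any callable with this
    (system_prompt, user_prompt) -> str signature can be passed to
    strict_json() via the llm parameter."""
    # A real implementation would call Azure OpenAI here; this stub
    # returns a fixed JSON string so the contract can be tested offline.
    return json.dumps({
        "Sentiment": "Positive",
        "Adjectives": ["beautiful", "sunny"],
        "Words": 7,
    })

# strict_json invokes the callable roughly like this:
raw = mock_llm("You are a classifier", "It is a beautiful and sunny day")
parsed = json.loads(raw)
```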
twilight2001 commented 4 days ago

Dear John, yes it does. I hadn't watched Tutorial 0. Works great! Cheers, GLY