Cocoon-Data-Transformation / cocoon

MIT License

Feature Request Thread #1

Open zachary62 opened 6 months ago

zachary62 commented 6 months ago

😊 Please submit and upvote features you want

brsolo commented 2 months ago

It would be nice to have easier support for all OpenAI models (or at least the latest gpt-4-turbo and gpt-3.5-turbo). Right now, to use a different OpenAI model, I manually change the model parameter in init (below). I tried just adding another elif statement, but `openai.api_type` is used by util.py within the openai library, so this raises an error.

One quick fix is to add the elif statement below, but it seems like bad form to redefine an environment variable.

Maybe a better solution is to have two separate environment variables: one for `openai.api_type` and one for the model.

Current code:

    elif openai.api_type == 'open_ai':

        response = openai.ChatCompletion.create(
            model="gpt-4-1106-preview",
            temperature=temperature,
            top_p=top_p,
            messages=messages
        )

Potential quick fix:

    elif openai.api_type == 'gpt-3.5-turbo-0125':

        os.environ['OPENAI_API_TYPE'] = 'open_ai'

        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0125",
            temperature=temperature,
            top_p=top_p,
            messages=messages
        )