openai / openai-python

The official Python library for the OpenAI API
https://pypi.org/project/openai/
Apache License 2.0

Issue with OpenAI API: Incompatibility with Specific Word #472

Closed sky12752 closed 1 year ago

sky12752 commented 1 year ago

Describe the bug

We are using the OpenAI API in our application. It does not return a response (a summary of a given text) when the text includes the word "Ideation"; if we remove the word or make it lowercase ("ideation"), we do get a response. We are unable to find the exact cause of this issue. Can you please help us with this? We are using the TextDavinci003 and Gpt3.5-Turbo models to generate the response.

To Reproduce

We expect to receive a summary with the word "Ideation" included in it, but when that word is present in the input text no summary is returned.
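A minimal sketch of the kind of call that fails for us (the sample text and key handling here are illustrative; our full function is under Code snippets below):

import openai

openai.api_key = "sk-..."  # illustrative; in our app the key/base is set by configure_openai()

# Sample text, illustrative; the real input is a longer document containing "Ideation".
text = "The team held an Ideation workshop to gather product concepts."

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": f"{text}\n\nSummarize the above text."}],
    temperature=0,
    max_tokens=2000,
)
# For us this prints nothing useful when "Ideation" is present; lowercasing the word fixes it.
print(response["choices"][0]["message"]["content"])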

Code snippets

import openai


def gen_summary(input_text, model, prompt):
    # configure_openai() is our own helper that sets the API key/base for the given model.
    configure_openai(model)
    text_size = len(input_text.split())

    if text_size <= 0:
        # No input text was supplied: fall back to sending the prompt alone.
        input_text = prompt
        try:
            if model != "Turbo35-0301":
                # Completion endpoint for our text-davinci style deployment.
                response = openai.Completion.create(
                    engine=model,
                    prompt=f" {prompt} {input_text}",
                    temperature=0,
                    max_tokens=2000,
                    top_p=1,
                    frequency_penalty=0,
                    presence_penalty=0,
                )
                summary = response["choices"][0]["text"]
                chargable_tokens = response["usage"]["prompt_tokens"]
                return summary.replace("\n", "").strip(), chargable_tokens
            else:
                # Chat endpoint for our gpt-3.5-turbo deployment ("Turbo35-0301").
                response = openai.ChatCompletion.create(
                    engine=model,
                    messages=[{"role": "user", "content": f"{input_text}\n\n {prompt} "}],
                    temperature=0,
                    max_tokens=2000,
                    top_p=1,
                    frequency_penalty=0,
                    presence_penalty=0,
                )
                summary = response["choices"][0]["message"]["content"]
                chargable_tokens = response["usage"]["prompt_tokens"]
                print(summary)
                return summary, chargable_tokens
        except Exception as e:
            # On failure we return the exception object to the caller.
            return e
    else:
        try:
            if model != "Turbo35-0301":
                response = openai.Completion.create(
                    engine=model,
                    prompt=f"{input_text} {prompt} ",
                    temperature=0,
                    max_tokens=2000,
                    top_p=1,
                    frequency_penalty=0,
                    presence_penalty=0,
                )
                summary = response["choices"][0]["text"]
                chargable_tokens = response["usage"]["prompt_tokens"]
                return summary.replace("\n", "").strip(), chargable_tokens
            else:
                response = openai.ChatCompletion.create(
                    engine=model,
                    messages=[{"role": "user", "content": f" {input_text}\n\n {prompt}"}],
                    temperature=0,
                    max_tokens=2000,
                    top_p=1,
                    frequency_penalty=0,
                    presence_penalty=0,
                )
                summary = response["choices"][0]["message"]["content"]
                print(summary)
                chargable_tokens = response["usage"]["prompt_tokens"]
                return summary, chargable_tokens
        except Exception as e:
            return e
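For context, we invoke the function above roughly like this (the argument values here are illustrative):

# Illustrative call; "TextDavinci003" is one of our deployment names used above.
summary, chargable_tokens = gen_summary(
    input_text="The team held an Ideation workshop to gather product concepts.",
    model="TextDavinci003",
    prompt="Summarize the following text:",
)
print(summary, chargable_tokens)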

OS

Windows

Python version

Python 3.10

Library version

openai 0.27.2

logankilpatrick commented 1 year ago

Not able to replicate, closing this out.