mukulpatnaik / researchgpt

An LLM-based research assistant that lets you have a conversation with a research paper
https://www.dara.chat
MIT License
3.55k stars 340 forks

ChatGPT API #28

Closed: goldengrape closed this issue 1 year ago

goldengrape commented 1 year ago

I tried:

r = openai.ChatCompletion.create(
            model="gpt-3.5-turbo", 
            messages=prompt, 
            temperature=0.4, 
            max_tokens=1500)

in `main-local.py`, but it doesn't work.

magicgh commented 1 year ago

`prompt` should be rewritten as an array of dictionaries, each with a `role` and a `content` field. Example:

prompt=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
        {"role": "user", "content": "Where was it played?"}
    ]
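
A minimal sketch of the change, assuming `prompt` is the plain string that `main-local.py` already builds (the helper name and the optional `system` argument are illustrative, not part of the repo):

```python
def build_messages(prompt, system=None):
    """Wrap a plain prompt string in the Chat API's list-of-dicts format."""
    messages = []
    if system:
        # An optional system message steers the assistant's behavior.
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return messages

print(build_messages("Where was it played?",
                     system="You are a helpful assistant."))
```

The resulting list can be passed directly as the `messages` argument of `openai.ChatCompletion.create`.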
goldengrape commented 1 year ago

@magicgh

cool! this works:

    messages = [
        {"role": "user", "content": prompt}
    ]
    r = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=messages,
        temperature=0.4,
        max_tokens=1500)

    # The content field is already a str, so no encode/decode round-trip is needed.
    answer = r["choices"][0]["message"]["content"]
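
The extraction step can be exercised without an API call. The dict below only mimics the shape of a ChatCompletion response (the values are invented for illustration):

```python
# Stand-in for a ChatCompletion response; shape matches the API,
# but the content value here is made up.
fake_response = {
    "choices": [
        {"message": {"role": "assistant",
                     "content": "It was played in Arlington, Texas."}}
    ]
}

def extract_answer(response):
    """Pull the assistant's reply out of a ChatCompletion-style response dict."""
    return response["choices"][0]["message"]["content"]

print(extract_answer(fake_response))
```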
MrPeterJin commented 1 year ago

Hi, I have implemented this in my fork. You can have a try!