gorel / discord_dicebot


Upgrade openAI stuff #115

Closed · jrodal98 closed this 7 months ago

jrodal98 commented 7 months ago

Wut

OpenAI deprecated some models and made backwards-incompatible changes to their API. Bruh
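
For context, here is a rough sketch of the shape of the change (not this repo's actual old code; the model names are just examples): the older Completions endpoint took a flat prompt string, while the Chat Completions endpoint used below expects a list of role-tagged messages.

# Hypothetical illustration of the API change, not the bot's previous code.

# Old style: POST https://api.openai.com/v1/completions with a since-deprecated model
old_payload = {
    "model": "text-davinci-003",  # example of a deprecated completions model
    "prompt": "What is a solar eclipse",
    "max_tokens": 2048,
}

# New style: POST https://api.openai.com/v1/chat/completions with role-tagged messages
new_payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "What is a solar eclipse"}],
    "max_tokens": 2048,
}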

Unit test

┌─jrodal@jrodal-mbp discord_dicebot on  openai_fixes [?] via 🐍 v3.10.9 (.venv) on ☁️  (us-west-2) took 2s
└─> python -m unittest dicebot.commands.test.test_ask
..
----------------------------------------------------------------------
Ran 2 tests in 0.285s

OK
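
For reference, a test along these lines can mock the HTTP layer so it runs without an API key. This is only a sketch: the real tests live in dicebot/commands/test/test_ask.py, the import path below is assumed, and the function signature is assumed to match the ask_openai in the script further down.

import unittest
from unittest import mock

# Assumed import path; adjust to wherever ask_openai actually lives.
from dicebot.commands.ask import ask_openai


class TestAsk(unittest.IsolatedAsyncioTestCase):
    async def test_returns_message_content(self) -> None:
        fake_json = {"choices": [{"message": {"content": "  an answer  "}}]}

        fake_response = mock.MagicMock()
        fake_response.json = mock.AsyncMock(return_value=fake_json)

        # session.post(...) is used as an async context manager, so the mock
        # must implement __aenter__/__aexit__ and yield the fake response.
        fake_cm = mock.MagicMock()
        fake_cm.__aenter__ = mock.AsyncMock(return_value=fake_response)
        fake_cm.__aexit__ = mock.AsyncMock(return_value=False)

        with mock.patch("aiohttp.ClientSession.post", return_value=fake_cm):
            result = await ask_openai("What is a solar eclipse")

        self.assertEqual(result, "an answer")


if __name__ == "__main__":
    unittest.main()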

Manual-ish test

┌─jrodal@jrodal-mbp discord_dicebot on  openai_fixes [?] via 🐍 v3.10.9 (.venv) on ☁️  (us-west-2)
└─> python test.py
Enter prompt: What is a solar eclipse
A solar eclipse occurs when the Moon passes between the Earth and the Sun, blocking out the Sun's light and casting a shadow on the Earth. This can result in a partial or total blocking of the Sun, creating a dramatic darkening of the sky during the day. Solar eclipses are rare events that can be seen from specific locations on Earth, and they can have a significant impact on the surrounding environment and wildlife.

Used this script:

import asyncio
import os
import aiohttp
import logging

# https://platform.openai.com/docs/api-reference/chat/create
_URL = "https://api.openai.com/v1/chat/completions"
_DEFAULT_MODEL = "gpt-3.5-turbo"
_DEFAULT_MAX_TOKENS = 2048
_DEFAULT_TEMPERATURE = 0.5
_SECRET = os.getenv("OPENAI_API_KEY")

prompt = input("Enter prompt: ")

async def ask_openai(prompt: str) -> str:
    """Ask a question to openai"""
    async with aiohttp.ClientSession() as session:
        async with session.post(
            _URL,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {_SECRET}",
            },
            json={
                "messages": [{"role": "user", "content": prompt}],
                "model": _DEFAULT_MODEL,
                "max_tokens": _DEFAULT_MAX_TOKENS,
                "temperature": _DEFAULT_TEMPERATURE,
            },
        ) as response:
            json_resp = await response.json()
            logging.info(f"JSON response from model: {json_resp}")
            if "choices" in json_resp:
                return json_resp["choices"][0]["message"]["content"].strip()
            else:
                return json_resp["error"]["message"]

print(asyncio.run(ask_openai(prompt)))
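
To reproduce the manual test, export OPENAI_API_KEY in the environment before running python test.py; if the key is missing or invalid, the script prints the API's error message instead of a completion.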