WhatTheFuzz / binaryninja-openai

Integrates OpenAI with BinaryNinja via a plugin.
MIT License
68 stars 6 forks

Update to also support gpt-3.5-turbo and gpt-4 and gpt-4-32k #24

Closed: owah closed this issue 1 year ago

owah commented 1 year ago

Hi,

thanks for this add-on! I was wondering whether it would make sense to add support for gpt-3.5. According to OpenAI's website, gpt-3.5-turbo costs 1/10th as much as text-davinci-003, and the upcoming gpt-4 will support up to 32k tokens.

WhatTheFuzz commented 1 year ago

Yep, it totally does. I meant to get to it last week. I'm on vacation currently, but I'll get around to it next week.

I'd like to actually just use the API to get the models dynamically so future updates aren't needed.
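The idea above, listing the available models from the API instead of hard-coding them, could look roughly like this. This is a minimal sketch assuming the pre-1.0 `openai` Python library (the version the traceback later in this thread comes from); the filtering rules and the sample id list are illustrative assumptions, not the plugin's actual code.

```python
def gpt_model_ids(model_ids):
    """Filter a list of model ids down to the GPT text models a plugin
    could offer, skipping fine-tunes (ids containing ':') and non-text
    models such as whisper or embedding models. Heuristic for illustration."""
    keep = []
    for mid in model_ids:
        if mid.startswith(("gpt-", "text-davinci")) and ":" not in mid:
            keep.append(mid)
    return sorted(keep)

# With the pre-1.0 openai library, the ids would come from the API, e.g.:
#   import openai
#   ids = [m["id"] for m in openai.Model.list()["data"]]
# Hypothetical offline example:
ids = ["gpt-4", "gpt-3.5-turbo", "text-davinci-003", "whisper-1",
       "text-embedding-ada-002", "gpt-4-32k"]
print(gpt_model_ids(ids))
```

With a refresh like this at plugin load time, newly released models would show up without a plugin update.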

Thanks for raising an issue.

WhatTheFuzz commented 1 year ago

Added the new models on a feature branch. I am away from my primary device where I have Binja installed. I hope to test it and cut a release this weekend.

owah commented 1 year ago

Hey,

I did some testing: gpt-4 and gpt-3.5-turbo don't work:

openai.error.InvalidRequestError: This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?

This is somewhat unexpected, because gpt-4-32k works even though v1/completions shouldn't support it either.
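The error above means the request has to be routed by model type: chat models only accept v1/chat/completions, while legacy completion models use v1/completions. A minimal sketch of that routing, assuming the pre-1.0 `openai` Python library (matching the traceback above); the prefix list and helper names are illustrative, not the plugin's actual implementation:

```python
# Chat-only model families; an assumption for illustration.
CHAT_MODEL_PREFIXES = ("gpt-3.5-turbo", "gpt-4")

def is_chat_model(model: str) -> bool:
    """Return True for models that must use v1/chat/completions."""
    return model.startswith(CHAT_MODEL_PREFIXES)

def query(model: str, prompt: str) -> str:
    """Send a prompt to whichever endpoint the model supports."""
    import openai  # pre-1.0 openai library assumed

    if is_chat_model(model):
        # Chat models reject v1/completions with the InvalidRequestError above.
        response = openai.ChatCompletion.create(
            model=model,
            messages=[{"role": "user", "content": prompt}],
        )
        return response["choices"][0]["message"]["content"]

    # Legacy completion models (e.g. text-davinci-003) use v1/completions.
    response = openai.Completion.create(model=model, prompt=prompt)
    return response["choices"][0]["text"]
```

A prefix check like this would also explain the gpt-4-32k observation: whether a given id is accepted by v1/completions depends on how the server classifies it, so routing by model family on the client side is the safer fix.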

WhatTheFuzz commented 1 year ago

It looks like their docs do address this. Shouldn't be a difficult fix. I'll take a look. Thanks for reporting the issue.

WhatTheFuzz commented 1 year ago

Implemented by @pilvar222 in #27. It's been merged into main. Let me know if you run into any issues! Sorry it took so long.