Open · jielabs opened 1 year ago

I am wondering whether it is possible to make this support GPT-4 or GPT-3.5, and if so, how?

There's no problem in principle, but it's a bit annoying to do because OpenAI changed the interface for ChatGPT and GPT-4: the input is segmented into user and assistant messages, rather than letting you prompt however you want. Additionally, text-davinci-002 let you specify suffixes in addition to prefixes, which the later models don't. Also, due to these API limitations, you wouldn't be able to use the detailed controller, so you'd have to turn that off. It's quite possible that switching OPT to ChatGPT or GPT-4 makes up for that loss, though.

The overall structure of generation would probably work similarly, but using ChatGPT or GPT-4 would require going through and replacing all the places where we call davinci or text-davinci-002, especially in https://github.com/yangkevin2/doc-story-generation/blob/main/story_generation/common/summarizer/models/gpt3_summarizer.py, and then tweaking the prompts as necessary; it might take a bit of manual experimentation.
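To make the interface change concrete, here is a minimal sketch of a hypothetical helper (not part of the repo) that converts a legacy prompt/suffix pair into the `messages` list the chat models expect. Since the chat endpoint has no `suffix` infilling parameter, the suffix has to be folded into the instructions instead, which is one reason prompts would need manual re-tuning:

```python
def completion_args_to_chat_messages(prompt, suffix=None):
    """Hypothetical adapter: map a legacy (prompt, suffix) pair onto a
    chat-style `messages` list.

    The legacy Completions API took a free-form `prompt` plus an optional
    `suffix` for infilling between a prefix and a suffix. Chat models take
    role-tagged messages and have no suffix parameter, so when a suffix is
    given we approximate infilling by asking the model (in instructions)
    to bridge the prefix and suffix.
    """
    if suffix is None:
        # No infilling needed: the prompt becomes a single user message.
        return [{"role": "user", "content": prompt}]
    return [{
        "role": "user",
        "content": (
            "Continue the text so that it connects the prefix to the "
            f"suffix.\n\nPrefix:\n{prompt}\n\nSuffix:\n{suffix}"
        ),
    }]
```

The resulting list would be passed as the `messages` argument of a chat-completion call in place of the old `prompt`/`suffix` arguments; this is only an approximation of true suffix-conditioned infilling, which is part of why the detailed controller cannot be used as-is.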
Hi, have you implemented it with GPT-4?
Hi, please see https://github.com/facebookresearch/doc-storygen-v2. It should be easier to swap the LM API there to GPT-4 directly (it should already be usable with GPT-3.5).