Hey there, I'd like to know if there will be a release with the gpt-3.5 model. If not, how easy would it be to integrate it into the code?

Hey, sorry about the late reply. Supporting chat models is easy. All you need to do is implement a new model class in the llm.py file (https://github.com/keirp/automatic_prompt_engineer/blob/main/automatic_prompt_engineer/llm.py) that inherits from the LLM abstract class. You can use the GPT_forward class for reference and copy a lot of the code from there.
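For anyone landing here later, here is a rough sketch of what such a class could look like. It assumes the LLM base class expects the same `generate_text` / `log_probs` interface that GPT_forward implements (check llm.py for the exact abstract method signatures), and it uses the pre-1.0 `openai` Python client (`openai.ChatCompletion.create`). The class name `GPT_Chat` and the config layout are placeholders, not part of the library:

```python
import openai

from automatic_prompt_engineer import llm


class GPT_Chat(llm.LLM):
    """Hypothetical chat-model wrapper (e.g. gpt-3.5-turbo) modeled on GPT_forward."""

    def __init__(self, config):
        # Assumed to look like the config passed to GPT_forward,
        # e.g. {'gpt_config': {'model': 'gpt-3.5-turbo'}}.
        self.config = config

    def generate_text(self, prompt, n):
        # Chat endpoints take a list of messages instead of a raw prompt string.
        prompts = [prompt] if isinstance(prompt, str) else prompt
        outputs = []
        for p in prompts:
            response = openai.ChatCompletion.create(
                model=self.config['gpt_config'].get('model', 'gpt-3.5-turbo'),
                messages=[{'role': 'user', 'content': p}],
                n=n,
            )
            outputs += [choice.message.content for choice in response.choices]
        return outputs

    def log_probs(self, text, log_prob_range=None):
        # The chat completion API does not return token log probabilities,
        # so log-prob-based scoring would need a different approach for chat models.
        raise NotImplementedError('Chat models do not expose log probs.')
```

Note that any scoring path in APE that relies on log probabilities would still need rethinking for chat models, since the chat API only returns generated text.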