Open abetlen opened 11 months ago
I would like to contribute updates to llama-cpp-python examples so that they all work with the updated v1 OpenAI API. Should I associate the PR with this issue, or some other issue related to examples?
@jperiodlangley yes, this issue would be great. If you take a similar approach to the create_chat_completion_openai_v1 function, that'll be easy to merge.
With the update to v1, OpenAI's API changed significantly; while backwards compatibility was straightforward to preserve on the server, the Python API is lagging behind.
The main difference between the pre- and post-v1 APIs is that OpenAI no longer returns dictionary-like objects; instead, fields are accessed directly as attributes.
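To illustrate the shape of that change, here is a minimal sketch. The dataclasses below are stand-ins for the typed (pydantic) objects the v1 client returns, not the real openai types; only the access pattern is the point.

```python
from dataclasses import dataclass
from typing import List

# Stand-in types mimicking the v1 client's typed response objects
# (the real library uses pydantic models; these are illustrative only).
@dataclass
class Message:
    role: str
    content: str

@dataclass
class Choice:
    index: int
    message: Message

@dataclass
class ChatCompletion:
    id: str
    choices: List[Choice]

# Pre-v1 style: responses behaved like dictionaries.
legacy = {
    "id": "cmpl-1",
    "choices": [{"index": 0, "message": {"role": "assistant", "content": "hi"}}],
}
text_old = legacy["choices"][0]["message"]["content"]

# v1 style: fields are accessed directly as attributes.
resp = ChatCompletion(
    id="cmpl-1",
    choices=[Choice(index=0, message=Message(role="assistant", content="hi"))],
)
text_new = resp.choices[0].message.content

assert text_old == text_new == "hi"
```

This is why existing examples that index into the response with `["choices"]` break against the v1 client.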
My intention is to preserve the current API of the Llama class and not introduce any breaking changes; however, it would be nice to support the new API in such a way that OpenAI applications can swap out and use a local llama.cpp model.
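As a sketch of what that swap could look like for an application already on the v1 client: the only change is the base_url (pointing at the llama-cpp-python OpenAI-compatible server, which has to be started separately, e.g. via `python -m llama_cpp.server`). The port, API key placeholder, and model alias below are assumptions for illustration.

```python
def make_local_client(base_url: str = "http://localhost:8000/v1"):
    # An existing OpenAI v1 application only needs to override base_url;
    # the local server ignores the API key, but the client requires one.
    from openai import OpenAI
    return OpenAI(base_url=base_url, api_key="sk-no-key-required")

def ask(client, prompt: str) -> str:
    # Standard v1 chat-completion call; the model alias is an assumption
    # and depends on how the local server was configured.
    resp = client.chat.completions.create(
        model="local-model",
        messages=[{"role": "user", "content": prompt}],
    )
    # v1 attribute-style access, not dictionary indexing.
    return resp.choices[0].message.content
```

With this shape, the same application code runs unchanged against either OpenAI's hosted API or a local llama.cpp model.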