Closed nuomizai closed 6 months ago
Hi,
It is not currently possible. Lamorel could be modified to add a new LLM type (i.e. an API-based one), but most of Lamorel's features (distributed training, initializers, custom functions...) would not apply to it.
Lamorel was designed for high-scale local LLM usage. Similar tools exist for APIs (e.g. https://github.com/openai/openai-cookbook/blob/main/examples/api_request_parallel_processor.py).
So I would advise writing a piece of code that calls either Lamorel's server or OpenAI's Python library, depending on the requested LLM.
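In case it helps, here is a minimal sketch of that idea: a small dispatcher that routes a prompt either to the OpenAI Python client or to a local generation path (e.g. one backed by Lamorel's server). The `API_MODELS` set, the function names, and the `local_generate` callable are illustrative assumptions, not part of Lamorel or the SayCan example; only the OpenAI client call follows the real `openai` (v1+) library API.

```python
# Hypothetical dispatcher: route prompts to the OpenAI API or to a local
# LLM backend depending on the requested model name. Names and structure
# here are assumptions for illustration, not Lamorel's actual API.

from typing import Callable

API_MODELS = {"gpt-3.5-turbo", "gpt-4"}  # models served via the OpenAI API


def is_api_model(model_name: str) -> bool:
    """Return True if this model should be queried through the OpenAI API."""
    return model_name in API_MODELS


def generate(prompt: str, model_name: str,
             local_generate: Callable[[str], str]) -> str:
    """Generate text with an API model or with a local backend.

    `local_generate` stands in for whatever function wraps your Lamorel
    server (e.g. a call through Lamorel's Caller); wiring that up depends
    on your Lamorel configuration.
    """
    if is_api_model(model_name):
        # API path: OpenAI's Python client (requires OPENAI_API_KEY).
        from openai import OpenAI
        client = OpenAI()
        response = client.chat.completions.create(
            model=model_name,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content
    # Local path: delegate to the Lamorel-backed function.
    return local_generate(prompt)
```

The import of `openai` is kept inside the API branch so the local path works even when the OpenAI library is not installed. Note that with this approach the API models are inference-only: Lamorel's distributed training and custom modules only apply to the local branch.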
Thanks for your reply.
Is it possible to use the GPT-3/GPT-4 API in Lamorel? For example, in the SayCan example, how can I use the API instead of a local LLM?