flowersteam / lamorel

Lamorel is a Python library designed for RL practitioners eager to use Large Language Models (LLMs).
MIT License

Using API in lamorel #30

Closed nuomizai closed 6 months ago

nuomizai commented 6 months ago

Is it possible to use GPT3/GPT4 API in lamorel? For example, in the saycan example, how can I use API instead of a local LLM model?

ClementRomac commented 6 months ago

Hi,

It is not currently possible. Lamorel could be modified to add a new LLM type (i.e. API) but most of Lamorel's features (distributed training, initializers, custom functions...) wouldn't be possible.

Lamorel was designed for high-scale local LLM usage. Similar tools exist for API (e.g. https://github.com/openai/openai-cookbook/blob/main/examples/api_request_parallel_processor.py).

So I would advise writing a piece of code that dispatches between Lamorel's server and OpenAI's Python library, depending on which LLM is requested.
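A minimal sketch of that dispatch idea: a small factory that returns either a local-backend or an API-backend generation function. The backend functions here are stubs; in practice the local one would wrap a `lamorel.Caller` and the API one the OpenAI Python client (exact call signatures depend on your versions of those libraries).

```python
from typing import Callable

def make_llm(backend: str,
             local_fn: Callable[[str], str],
             api_fn: Callable[[str], str]) -> Callable[[str], str]:
    """Return the generation function for the requested backend.

    `local_fn` would typically wrap Lamorel's server (e.g. a lamorel.Caller),
    `api_fn` an API client such as OpenAI's -- both are stand-ins here.
    """
    if backend == "lamorel":
        return local_fn
    if backend == "openai":
        return api_fn
    raise ValueError(f"Unknown backend: {backend!r}")

# Stub backends standing in for the real Lamorel / OpenAI calls:
llm = make_llm("openai",
               local_fn=lambda prompt: f"[local] {prompt}",
               api_fn=lambda prompt: f"[api] {prompt}")
print(llm("pick up the apple"))  # prints "[api] pick up the apple"
```

The rest of the agent's code (e.g. the SayCan loop) then only sees a `prompt -> text` callable and stays unchanged whichever backend is selected.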

nuomizai commented 6 months ago


Thanks for your reply.