This CL adds lf.llms.REST and consolidates the Anthropic/Groq models on top of it.
The motivation is two-fold: a) to better iterate on the code shared among different REST-based models; b) to allow users to quickly develop new REST-based clients in a few lines of code, without worrying about concurrency and error control.
lf.llms.REST can be used in two ways:
1) Composition:
lf.llms.REST(
    api_endpoint='...',    # URL to call.
    headers=dict(...),     # Headers for model choice and authentication.
    request=lambda prompt, sampling_options: dict(...),  # Prompt -> request JSON.
    result=lambda json: lf.llms.LMSamplingResult(...),   # Response JSON -> result.
    max_concurrency=8,     # Concurrency control.
)
2) Subclassing:
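A sketch of the subclassing route, mirroring the composition parameters above. The class name, endpoint, and JSON shapes are hypothetical, and a tiny stand-in base replaces lf.llms.REST so the snippet runs standalone; in real code you would subclass lf.llms.REST, which handles the HTTP calls, concurrency, and error control.

```python
# Sketch only: RESTBase stands in for lf.llms.REST so this runs standalone.

class RESTBase:
  """Stand-in exposing the hooks a subclass overrides."""

  @property
  def headers(self):
    return {}

  def request(self, prompt, sampling_options):
    raise NotImplementedError()

  def result(self, json):
    raise NotImplementedError()


class MyModel(RESTBase):
  """Hypothetical REST-based model client."""

  api_endpoint = 'https://api.example.com/v1/complete'  # Invented URL.

  @property
  def headers(self):
    # Headers for model choice and authentication.
    return {'Authorization': 'Bearer <API_KEY>'}

  def request(self, prompt, sampling_options):
    # Prompt -> provider request JSON (shape is invented).
    return {'prompt': str(prompt)}

  def result(self, json):
    # Provider response JSON -> sampled text(s).
    return [c['text'] for c in json['choices']]
```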