Mirascope / mirascope

LLM abstractions that aren't obstructions
https://docs.mirascope.io/
MIT License

Add support for HuggingFace models (i.e. HuggingFaceCall) #215

Open brenkao opened 4 months ago

brenkao commented 4 months ago

Is your feature request related to a problem? Please describe. While HuggingFace models can already be used through the OpenAICall class (by setting base_url, model, and api_key="-"), we should support the HuggingFace InferenceClient through a dedicated HuggingFaceCall class.
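For reference, the existing workaround looks roughly like this (a sketch based on the description above; the endpoint URL and model ID are placeholders for an OpenAI-compatible HuggingFace/TGI endpoint):

```python
from mirascope.openai import OpenAICall, OpenAICallParams


class BookRecommender(OpenAICall):
    prompt_template = "Recommend a {genre} book."

    genre: str

    # Point the OpenAI client at an OpenAI-compatible HuggingFace endpoint
    # (e.g. a TGI server); the dummy api_key just satisfies the client.
    api_key = "-"
    base_url = "https://<your-hf-endpoint>/v1/"
    call_params = OpenAICallParams(model="meta-llama/Meta-Llama-3-8B-Instruct")


response = BookRecommender(genre="fantasy").call()
print(response.content)
```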

Describe the solution you'd like Create a HuggingFaceCall that extends BaseCall and follows the same principles as the other Calls. This means that call params are properly typed in a Pydantic model, responses extend BaseCallResponse, etc. A rough sketch of the shape is included below.

Additional context See OpenAICall for reference. https://github.com/Mirascope/mirascope/blob/dev/mirascope/openai/calls.py
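To make the ask concrete, here is a minimal, hypothetical sketch of the shape this could take. The names, fields, and defaults are illustrative only, and the stand-in class below does not actually wire into BaseCall or prompt_template parsing the way the real implementation would:

```python
from typing import Optional

from huggingface_hub import InferenceClient
from pydantic import BaseModel


class HuggingFaceCallParams(BaseModel):
    """Hypothetical typed call params, analogous to OpenAICallParams."""

    model: str = "meta-llama/Meta-Llama-3-8B-Instruct"
    max_new_tokens: int = 512
    temperature: float = 0.7


class HuggingFaceCall(BaseModel):
    """Hypothetical stand-in; the real class would extend BaseCall, reuse
    prompt_template parsing, and return a response extending BaseCallResponse."""

    prompt_template: str = ""
    api_key: Optional[str] = None
    call_params: HuggingFaceCallParams = HuggingFaceCallParams()

    def call(self) -> str:
        client = InferenceClient(model=self.call_params.model, token=self.api_key)
        # Returns the generated text; the real implementation would wrap
        # the full output in a HuggingFaceCallResponse.
        return client.text_generation(
            self.prompt_template,
            max_new_tokens=self.call_params.max_new_tokens,
            temperature=self.call_params.temperature,
        )
```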

willbakst commented 4 months ago

The main thing on my mind for this feature is how this will interact with our prompt_template parsing. Will users want direct access to the underlying model prompt structure?

e.g. https://llama.meta.com/docs/model-cards-and-prompt-formats/meta-llama-3/
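For context, the linked model card shows that Llama 3 instruct prompts are built from special tokens. A rough illustration of the raw structure for a single user turn (the message content is just an example) is what users might want direct access to:

```python
# Raw Llama 3 instruct prompt for one user message, per the linked model card.
raw_prompt = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
    "Recommend a fantasy book.<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)
```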

willbakst commented 3 months ago

Likely worth looking into how we can propagate the logits of the output as well.
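For what it's worth, when the endpoint is TGI-backed the InferenceClient already exposes per-token log probabilities (not raw logits) via details=True, so one option would be to surface these on the response. A sketch, not a design:

```python
from huggingface_hub import InferenceClient

client = InferenceClient(model="meta-llama/Meta-Llama-3-8B-Instruct")

# With details=True, text_generation returns a detailed output whose
# .details.tokens carry per-token log probabilities.
output = client.text_generation(
    "Recommend a fantasy book.", details=True, max_new_tokens=64
)
for token in output.details.tokens:
    print(token.text, token.logprob)
```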