stanfordnlp / dspy

DSPy: The framework for programming—not prompting—language models
https://dspy.ai
MIT License

LiteLLM error message #1619

Closed: fullstackwebdev closed this issue 2 weeks ago

fullstackwebdev commented 1 month ago

Using the LiteLLM OpenAI adapter to override the base URL causes every call to print a large dump of LiteLLM messages, which renders the terminal useless in large programs. To reproduce:

  1. Set up an OpenAI-compatible endpoint.
  2. Set your API_BASE to point at it.
  3. Use the 'openai/MODELNAME' syntax for the model name.

Simple app:

import logging
import os

import dspy

PORT = os.getenv('PORT', 6002)
API_BASE = f"http://localhost:{PORT}/v1/"
MODEL_NAME = "openai/unsloth/gemma-2-27b-it-bnb-4bit"

# Neither of these silences the LiteLLM messages:
logging.getLogger("dspy").setLevel(logging.ERROR)
logging.getLogger("dspy").propagate = False

lm = dspy.LM(MODEL_NAME, api_key="..", api_base=API_BASE)
dspy.configure(lm=lm)

lm("What is 2+2?", temperature=0.9)

Output:

Provider List: https://docs.litellm.ai/docs/providers

['2 + 2 = 4']

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers


arnavsinghvi11 commented 1 month ago

Hi @fullstackwebdev, does the discussion from this LiteLLM issue help? (litellm.suppress_debug_info = True)
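
A minimal sketch of how that suggestion could be applied to the repro above. The litellm.suppress_debug_info flag is the one mentioned in the linked discussion; whether it silences every repeated "Provider List" message in this setup is not confirmed here.

import os

import litellm
import dspy

# Set the flag before making any calls so LiteLLM skips its provider-list printouts.
litellm.suppress_debug_info = True

PORT = os.getenv('PORT', 6002)
API_BASE = f"http://localhost:{PORT}/v1/"
MODEL_NAME = "openai/unsloth/gemma-2-27b-it-bnb-4bit"

lm = dspy.LM(MODEL_NAME, api_key="..", api_base=API_BASE)
dspy.configure(lm=lm)

print(lm("What is 2+2?", temperature=0.9))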