Using the LiteLLM OpenAI adapter to override the base URL and simply connecting produces a large dump of LiteLLM error messages, rendering the terminal useless in large programs.
Set up an OpenAI-compatible endpoint,
set your API_BASE to point at it,
and use the 'openai/MODELNAME' syntax (a sketch of this setup follows these steps).
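To make the first two steps concrete, here is a minimal sketch (not from the report) that calls such an endpoint with the stock openai Python client; the port, base URL, and model name are reused from the simple app below, the api_key placeholder is arbitrary, and the 'openai/' prefix is LiteLLM routing syntax, so it is not sent to the server itself.

import os

from openai import OpenAI

# reuse the same local endpoint the simple app points at
port = os.getenv("PORT", "6002")
client = OpenAI(base_url=f"http://localhost:{port}/v1/", api_key="..")

# the raw model name, without LiteLLM's "openai/" routing prefix
resp = client.chat.completions.create(
    model="unsloth/gemma-2-27b-it-bnb-4bit",
    messages=[{"role": "user", "content": "What is 2+2?"}],
)
print(resp.choices[0].message.content)

If this direct call works but the DSPy version floods the terminal, the noise is coming from the LiteLLM layer rather than from the endpoint.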
simple app
import logging
import os

import dspy

PORT = os.getenv('PORT', 6002)
API_BASE = f"http://localhost:{PORT}/v1/"
MODEL_NAME = "openai/unsloth/gemma-2-27b-it-bnb-4bit"

# attempts to silence the LiteLLM dump via the dspy logger
logging.getLogger("dspy").setLevel(logging.ERROR)  # doesn't work
logging.getLogger("dspy").propagate = False  # doesn't work

lm = dspy.LM(MODEL_NAME, api_key="..", api_base=API_BASE)
dspy.configure(lm=lm)
lm("What is 2+2?", temperature=0.9)
output