Use dspy.LM and give it a model name that follows the LiteLLM naming convention.
Thank you @okhat, I already did that. Here is a gist with my attempt at the code, https://gist.github.com/Chandrak1907/bfe6d8dc71f173882bc6caf3412b74d6, reproducing this blog post: https://drchrislevy.github.io/posts/dspy/dspy.html
Below is the further exploration I've done.

In `~/miniconda3/envs/py310/lib/python3.10/site-packages/dspy/predict/predict.py`, I commented out lines 107, 108, and 109 and hard-coded `temperature` and `num_generations`:

```python
temperature = 0
num_generations = 1
```
This results in an answer built from individual characters of the response:

```text
... completions ... [{'answer': '('}, {'answer': 'C'}, ...]
... pred ... Prediction(
    answer='(',
    completions=Completions(...)
) (1021 completions omitted)
```
This could be traced to line 74 in `/miniconda3/envs/py311/lib/python3.11/site-packages/dsp/primitives/predict.py`:

```python
completions: list[Example] = [template.extract(example, p) for p in completions]
```
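For what it's worth, here is a minimal sketch of one way this symptom can arise (an assumption on my part, not confirmed from the gist): if the custom LM's `__call__` returns a plain string rather than a list of strings, the comprehension above iterates over the string's characters, which would match the one-character `answer` fields and the ~1021 completions shown above.

```python
# Hypothetical illustration: 'completions' should be a list of completion
# strings, but suppose a custom LM returned the raw response text instead.
completions = "(C) The correct option is C because ..."

# Iterating over a string yields its individual characters,
# so each "completion" becomes a single character.
print([p for p in completions][:3])  # ['(', 'C', ')']
print(len(completions))              # one "completion" per character
```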
@Chandrak1907 No, I'm suggesting dspy.LM, e.g.:

```python
model_name = "provider_name/model_name"
lm = dspy.LM(model_name, api_base="....", api_key="....")
dspy.configure(lm=lm)
```
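For reference, here is a self-contained sketch of that pattern; the model name, endpoint, and key below are placeholders, not values from this thread:

```python
import dspy

# Placeholders: substitute your provider's LiteLLM-style model name,
# API base URL, and key.
lm = dspy.LM("openai/gpt-4o-mini", api_base="https://example.com/v1", api_key="sk-...")
dspy.configure(lm=lm)

# Once configured, DSPy modules pick up this LM automatically.
cot = dspy.ChainOfThought("question -> answer")
print(cot(question="What is 2 + 2?").answer)
```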
I did this and it worked for me, inside the `__init__` function:

```python
self.kwargs = kwargs
```
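For context, a rough sketch of where that line sits in a custom LM wrapper's constructor (the class name and surrounding attributes are illustrative, not taken from the gist):

```python
class OCIGenAILM:
    # Hypothetical custom LM wrapper; only the constructor is sketched here.
    def __init__(self, model, **kwargs):
        self.model = model
        self.provider = "default"
        self.history = []
        # The fix: keep generation kwargs on the instance so dspy can read
        # and update settings such as temperature and n at call time.
        self.kwargs = kwargs
```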
I'm trying to use DSPy with Oracle Cloud's Gen AI platform. The API is documented here: https://docs.oracle.com/en-us/iaas/api/#/en/generative-ai-inference/20231130/GenerateTextResult/GenerateText. Below is the code defining the LM in dspy.
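(The original code block isn't reproduced in this thread. As a purely hypothetical sketch, a wrapper for the GenerateText endpoint might look roughly like the following; the endpoint URL, auth headers, payload fields, and response field names are illustrative placeholders rather than OCI's actual schema:)

```python
import requests

class OCIGenerateTextLM:
    # Hypothetical sketch of an LM wrapper for OCI's GenerateText API.
    def __init__(self, model, endpoint, auth_headers, **kwargs):
        self.model = model
        self.provider = "default"
        self.history = []
        self.endpoint = endpoint
        self.auth_headers = auth_headers
        self.kwargs = kwargs  # generation settings (temperature, etc.)

    def basic_request(self, prompt, **kwargs):
        # Placeholder HTTP call; real code would use the `oci` SDK and the
        # GenerateText request schema from the docs linked above.
        payload = {"prompt": prompt, "model": self.model, **self.kwargs, **kwargs}
        resp = requests.post(self.endpoint, json=payload, headers=self.auth_headers)
        resp.raise_for_status()
        self.history.append({"prompt": prompt, "response": resp.json()})
        return resp.json()

    def __call__(self, prompt, only_completed=True, return_sorted=False, **kwargs):
        data = self.basic_request(prompt, **kwargs)
        # Important: return a *list* of completion strings, not a bare string.
        # "generated_texts" is a placeholder field name for illustration.
        return [data["generated_texts"][0]]
```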
Sending a prompt to the LLM works fine.
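For example, a direct call against the hypothetical wrapper above (the model id, endpoint, and prompt are illustrative):

```python
lm = OCIGenerateTextLM(
    model="cohere.command",  # illustrative model id
    endpoint="https://inference.generativeai.example.oraclecloud.com/generateText",
    auth_headers={"Authorization": "Bearer ..."},  # placeholder auth
)
print(lm("What is the capital of France?"))
# e.g. ['Paris is the capital of France.']
```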
However, the code below for running chain of thought fails.
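(The failing block isn't reproduced here either; presumably it was along these lines, using dspy's standard chain-of-thought module with an illustrative signature:)

```python
import dspy

dspy.settings.configure(lm=lm)

cot = dspy.ChainOfThought("question -> answer")
pred = cot(question="What is the capital of France?")
print(pred.answer)
```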
And I get the error below: