Closed — arthurgreef closed this issue 1 month ago
Thanks @arthurgreef! What is the serialize exception, i.e. what's the actual error?
You can also get a much more complete error message by doing:
```python
import litellm
litellm.set_verbose = True  # 👈 this is the 1-line change you need to make
```
Also, what's in `os.environ["AZURE_AI_MODEL_GPT_4o"]`? It should probably start with `azure/` or something like that for LiteLLM.
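For reference, here is a minimal sketch of how that model string gets composed for LiteLLM. The deployment name `gpt-4o` is hypothetical; in practice the env var holds your actual Azure OpenAI deployment name:

```python
import os

# Hypothetical deployment name, purely for illustration.
os.environ.setdefault("AZURE_AI_MODEL_GPT_4o", "gpt-4o")

# LiteLLM routes requests by provider prefix, so the model string for an
# Azure deployment should start with "azure/".
model = "azure/" + os.environ["AZURE_AI_MODEL_GPT_4o"]
print(model)
```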
I took another look. This works great. Thanks.
```python
import os

import dspy
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

# Acquire an Azure AD bearer token for the Cognitive Services scope.
azure_ad_token = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)()

llm_gpt_4o = dspy.LM(
    model="azure/" + os.environ["AZURE_AI_MODEL_GPT_4o"],
    api_version=os.environ["OPENAI_API_VERSION"],
    api_base=os.environ["AZURE_OPENAI_ENDPOINT"],
    azure_ad_token=azure_ad_token,
    max_tokens=3000,
)
dspy.configure(lm=llm_gpt_4o)
```
I am now having a problem with tracing in Phoenix using the adapters. Is there something that needs to be updated in Phoenix? Here is the code that works with the AzureOpenAI client.
```python
import os

# Needs to be set before the dspy import to get token tallies
os.environ["DSP_CACHEBOOL"] = "False"

import dspy
from openinference.instrumentation.dspy import DSPyInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from phoenix.otel import register

endpoint = "http://127.0.0.1:6006/v1/traces"
register(endpoint=endpoint)

resource = Resource(attributes={})
tracer_provider = trace_sdk.TracerProvider(resource=resource)
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace_api.set_tracer_provider(tracer_provider=tracer_provider)

DSPyInstrumentor().instrument()
OpenAIInstrumentor().instrument()
```
Yeah, the design of tracing (Phoenix and LangFuse) needs to change since the library is different internally for dspy.LM (DSPy 2.5 onwards).
By the way, you may also want to set:
```python
os.environ["DSPY_CACHEBOOL"] = "False"  # notice the Y, for 2.5 clients
```
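Since the variable name changed between client generations (`DSP_` vs `DSPY_`), a harmless belt-and-braces sketch is to set both before `import dspy`:

```python
import os

# Must run before `import dspy`, or the cache setting is picked up too late.
os.environ["DSP_CACHEBOOL"] = "False"   # legacy clients (pre-2.5)
os.environ["DSPY_CACHEBOOL"] = "False"  # dspy.LM clients (2.5 onwards)
```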
Requested a new feature to support DSPy 2.5.
Really appreciate your initiative on this @arthurgreef !
This has been fixed: https://github.com/Arize-ai/openinference/issues/1050
A warning about using the `AzureOpenAI` client is displayed in version 2.5.2. The recommendation is to use `dspy.LM`. I get the serialize exception when I change the client from this: to this: