Arize-ai / phoenix

AI Observability & Evaluation
https://docs.arize.com/phoenix

[BUG] Error when using llama-index-llms-anthropic from LlamaIndex #5025

Open camyoung93 opened 8 hours ago

camyoung93 commented 8 hours ago

Describe the bug
When using the LlamaIndex instrumentor and calling BedrockConverse, all expected spans are picked up, but after switching to llama-index-llms-anthropic the LLM spans go missing.

To Reproduce
Steps to reproduce the behavior:

  1. Instrument a LlamaIndex app that uses Claude via BedrockConverse
  2. Switch the LLM class to llama-index-llms-anthropic (see the sketch after this list)
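A minimal sketch of the setup described above, assuming spans are exported to a local Phoenix instance via `phoenix.otel.register()`; the project name and Claude model IDs are placeholders, not taken from the report, and AWS/Anthropic credentials are assumed to be configured:

```python
# Minimal repro sketch (assumed setup; model IDs are placeholders).
from phoenix.otel import register
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

# Export spans to a running Phoenix instance and instrument LlamaIndex.
tracer_provider = register(project_name="llama-anthropic-repro")
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)

# Case 1: BedrockConverse -- LLM spans are reported as expected.
from llama_index.llms.bedrock_converse import BedrockConverse
llm = BedrockConverse(model="anthropic.claude-3-5-sonnet-20240620-v1:0")
print(llm.complete("Hello"))

# Case 2: llama-index-llms-anthropic -- LLM spans go missing.
from llama_index.llms.anthropic import Anthropic
llm = Anthropic(model="claude-3-5-sonnet-20240620")
print(llm.complete("Hello"))
```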

Expected behavior
LLM spans for Claude are received with either LLM class.

Environment (please complete the following information):
openinference-instrumentation-llama-index version: 3.0.2


mikeldking commented 8 hours ago

Scheduling for next sprint