MagnivOrg / prompt-layer-library

🍰 PromptLayer - Maintain a log of your prompts and OpenAI API requests. Track, debug, and replay old completions.
https://www.promptlayer.com
Apache License 2.0

Anthropic API wrapper issue on count_tokens #60

Closed: BedirT closed this issue 9 months ago

BedirT commented 10 months ago

Hey,

I realized today that there is an error in the Anthropic wrapper: it assumes count_tokens is a response (completion) call, which it isn't. This results in a dashboard error.

(Two screenshots of the dashboard error, taken 2023-08-15, attached.)

This also blocks the history view.
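
For context, count_tokens in the Anthropic Python SDK is a local helper that returns a plain integer rather than a completion object, so there is no response to log for it. A minimal illustration of the return shape (nothing here is PromptLayer code):

import os
import anthropic

client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

# count_tokens is a utility, not a completion request: it returns an int,
# so there is no response object (id, model, usage, ...) to record.
n_tokens = client.count_tokens("test text with some words")
print(type(n_tokens), n_tokens)  # <class 'int'>, exact count depends on the tokenizer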

BedirT commented 10 months ago

Still not fixed

Jped commented 9 months ago

@BedirT sorry for the wait. Looking into this now.

Jped commented 9 months ago

We decided that this is an issue with our frontend and not our API. In other words, the expected behavior is to send everything called through our proxied library to PromptLayer, even if some of it may not be relevant, because it is hard for us to determine what is and isn't relevant without an explicit whitelist.

With that said, we just pushed a quick fix to the dashboard, and a better UX for this will be coming to the frontend soon.
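
For anyone following along, here is a rough sketch of the kind of attribute-forwarding proxy being described; TrackedClient and log_request are made-up names for illustration, not PromptLayer's actual implementation:

import time

class TrackedClient:
    """Illustrative proxy: forwards every attribute access to the wrapped
    client and reports every call it sees, because without a whitelist it
    cannot tell utility calls apart from real API requests."""

    def __init__(self, client, log_request):
        self._client = client
        self._log_request = log_request

    def __getattr__(self, name):
        attr = getattr(self._client, name)
        if not callable(attr):
            return attr

        def wrapped(*args, **kwargs):
            start = time.time()
            result = attr(*args, **kwargs)
            # Every call is reported, including local helpers such as
            # count_tokens, which is why they show up on the dashboard.
            self._log_request(name, args, kwargs, result, time.time() - start)
            return result

        return wrapped

With a wrapper along these lines, a local helper like count_tokens gets reported exactly like a completion call, which matches the behavior in the screenshots above.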

BedirT commented 9 months ago

I think you should look into it again. The wrapper still intercepts the count function, and that makes it extremely heavy. Based on my tests, the difference makes PromptLayer unusable with Anthropic (since we rely on the token counter for the most part):

API           Time taken (seconds, 100 samples)
anthropic     0.00220
promptlayer   31.58095

If you are curious, here is the basic code I used for testing:

import os
import time

import anthropic
import promptlayer

some_text = "test text with some words"

# Just Anthropic: time 100 count_tokens calls against the plain SDK client.
client = anthropic.Anthropic(
    api_key=os.environ["ANTHROPIC_API_KEY"]
)

start = time.time()
for _ in range(int(1e2)):
    client.count_tokens(some_text)
end = time.time()
print("Time taken {:.5f} seconds".format(end - start))

# With promptlayer: the same loop, but through the proxied anthropic module.
promptlayer.api_key = os.environ.get("PROMPTLAYER_API_KEY")
pl_client = promptlayer.anthropic.Anthropic(
    api_key=os.environ["ANTHROPIC_API_KEY"]
)

start = time.time()
for _ in range(int(1e2)):
    pl_client.count_tokens(some_text)
end = time.time()
print("Time taken {:.5f} seconds".format(end - start))

Jped commented 9 months ago

@BedirT the time difference is because we are sending the count_tokens response to our API. I will look into just skipping that function call.
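
One possible shape for that skip, sketched against a generic wrapper rather than the library's real internals; SKIP_LOGGING, call_and_maybe_log, and log_request are hypothetical names:

import time

# Hypothetical blocklist: local utility methods that should never be
# reported to the tracking backend.
SKIP_LOGGING = {"count_tokens"}

def call_and_maybe_log(client, method_name, log_request, *args, **kwargs):
    """Call a client method, but only report it when it is a real API request."""
    start = time.time()
    result = getattr(client, method_name)(*args, **kwargs)
    if method_name not in SKIP_LOGGING:
        log_request(method_name, args, kwargs, result, time.time() - start)
    return result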

Jped commented 9 months ago

@BedirT please upgrade the Python library; we just pushed a fix.
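
For anyone landing on this later: after upgrading (the package on PyPI is promptlayer), the installed version can be checked with the standard library:

from importlib.metadata import version

# Confirm which promptlayer release is installed after `pip install --upgrade promptlayer`.
print(version("promptlayer"))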