patterns-ai-core / langchainrb

Build LLM-powered applications in Ruby
https://rubydoc.info/gems/langchainrb
MIT License

Anthropic LLM swallows API errors #697

Closed Jbrito6492 closed 4 months ago

Jbrito6492 commented 4 months ago

Describe the bug
When the completions method is called on the AnthropicResponse object, it does not check whether the response contains an error. As a result it returns nil, and the actual API error is never surfaced.
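For illustration, here is a minimal sketch of the kind of guard I would expect, assuming the response wrapper receives the raw Anthropic payload as a Hash. The class and method names mirror this report rather than langchainrb's actual internals, and the error class used is illustrative; Anthropic does signal failures with a "type" => "error" payload containing an error type and message.

```ruby
# Hypothetical sketch, not langchainrb's real implementation.
class AnthropicResponse
  def initialize(raw_response)
    @raw_response = raw_response
  end

  def completions
    # Anthropic returns {"type" => "error", "error" => {...}} on failure;
    # raise instead of silently returning nil so callers see the cause.
    if @raw_response["type"] == "error"
      error = @raw_response["error"] || {}
      raise StandardError,
            "Anthropic API error (#{error["type"]}): #{error["message"]}"
    end

    @raw_response["completion"]
  end
end
```

With a check like this, the "credit balance" failure below would raise with the API's own message instead of returning nil.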

To Reproduce

  1. Have 0 credits in your Anthropic account
  2. Try to use the client.

I was able to reproduce this with the following lines of code (specification_text holds the contents of a local file in my project):

# The API key belongs to an account with 0 credits, which triggers the error
anthropic = Langchain::LLM::Anthropic.new(api_key: ENV["ANTHROPIC_API_KEY"])
# specification_text is the contents of a local file
chunker = Langchain::Chunker::Semantic.new(specification_text, llm: anthropic).chunks

Expected behavior
I expect to see the actual error returned by the API so I can fix the underlying issue, instead of an obfuscated nil-related failure downstream.


Screenshots
Error from my downstream Rails app: [screenshot]

Actual error from setting a debugger:

[screenshot]

The line of code that is returning nil instead of the error is highlighted below:

[screenshot]

Desktop: MacBook Pro
