Checked other resources
[X] I added a very descriptive title to this issue.
[X] I searched the LangChain documentation with the integrated search.
[X] I used the GitHub search to find a similar question and didn't find it.
[X] I am sure that this is a bug in LangChain rather than my code.
[X] The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
Code:
from langchain.chat_models import AzureChatOpenAI
from langchain_community.callbacks import get_openai_callback

# Initialize the LLM
llm = AzureChatOpenAI(
    temperature=0,
    model="gpt-4o-2024-08-06",
)

# Use the callback to track token usage and costs.
# prompt, content, document_type and the _process_* / _normalize_response
# helpers are defined elsewhere in our application.
with get_openai_callback() as cb:
    entity_response = _process_entity_extraction(llm, prompt, content, document_type)
    validation_response = _process_validation(llm, prompt, entity_response, document_type)
    page_info_response = _process_page_info(llm, prompt, content)
    final_response = validation_response | page_info_response
    final_response = _normalize_response(final_response)
    print(f"Total tokens: {cb.total_tokens}, Cost: ${cb.total_cost}")
Error Message and Stack Trace (if applicable)
No response
Description
Description:
When using the AzureChatOpenAI integration with LangChain, I observe a discrepancy in cost calculations compared to ChatOpenAI. While token usage is nearly identical between the two integrations, the reported cost for AzureChatOpenAI is approximately double.
Comparison of Logs:
ChatOpenAI:
    Input tokens: 11616
    Output tokens: 723
    Total cost: $0.036
AzureChatOpenAI:
    Input tokens: 11618
    Output tokens: 760
    Total cost: $0.069
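A quick arithmetic check of these figures against OpenAI's published per-token prices for gpt-4o-2024-08-06 (assumed here to be $2.50 per 1M input tokens and $10.00 per 1M output tokens):

# Back-of-the-envelope check of the numbers above, assuming gpt-4o-2024-08-06
# prices of $2.50 per 1M input tokens and $10.00 per 1M output tokens.
INPUT_PRICE_PER_TOKEN = 2.50 / 1_000_000
OUTPUT_PRICE_PER_TOKEN = 10.00 / 1_000_000

openai_cost = 11616 * INPUT_PRICE_PER_TOKEN + 723 * OUTPUT_PRICE_PER_TOKEN
azure_cost = 11618 * INPUT_PRICE_PER_TOKEN + 760 * OUTPUT_PRICE_PER_TOKEN

print(f"Expected ChatOpenAI cost:      ${openai_cost:.4f}")  # ~$0.0363
print(f"Expected AzureChatOpenAI cost: ${azure_cost:.4f}")   # ~$0.0366

By this estimate both runs should cost roughly $0.036; the AzureChatOpenAI figure reported above is close to double that.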
Expected Behavior:
Costs should be consistent for the same model (gpt-4o-2024-08-06) across both integrations, as the pricing tables for OpenAI and Azure align.
Observed Behavior:
The cost reported for AzureChatOpenAI is significantly higher, despite negligible differences in token usage.
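In case it helps with triage, the prices that get_openai_callback resolves for this model can be inspected via the helpers in langchain_community.callbacks.openai_info. This is only a diagnostic sketch; I have not confirmed which table entry the Azure code path ends up using.

# Diagnostic sketch: inspect the price table behind get_openai_callback.
# These helpers live in langchain_community.callbacks.openai_info (0.2.x);
# completion prices are typically stored under a "-completion" suffix.
from langchain_community.callbacks.openai_info import (
    MODEL_COST_PER_1K_TOKENS,
    get_openai_token_cost_for_model,
    standardize_model_name,
)

name = standardize_model_name("gpt-4o-2024-08-06")
print("standardized name:", name)
print("input $ per 1K   :", MODEL_COST_PER_1K_TOKENS.get(name))
print("output $ per 1K  :", MODEL_COST_PER_1K_TOKENS.get(name + "-completion"))

# Cost the callback should attribute to the AzureChatOpenAI run above.
expected = get_openai_token_cost_for_model(
    "gpt-4o-2024-08-06", 11618
) + get_openai_token_cost_for_model("gpt-4o-2024-08-06", 760, is_completion=True)
print(f"expected cost: ${expected:.4f}")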
System Info
System Info:
langchain: 0.2.16
langchain-community: 0.2.17
langchain-core: 0.2.41
langchain-openai: 0.1.25
langchain-text-splitters: 0.2.4
langsmith: 0.1.131
Model: gpt-4o-2024-08-06
Callback: langchain_community.callbacks.get_openai_callback
Additional Context: