wanjeakshay opened 1 week ago
To calculate the total token consumption for the evaluation process in Azure OpenAI, call the `total_tokens()` method on the `result` object. The parameter you should pass as `token_usage_parser` is `get_token_usage_for_openai`.
Here is the relevant code:

```python
from ragas import evaluate
from ragas.cost import get_token_usage_for_openai

result_2 = evaluate(
    dataset1,
    metrics=metrics,
    llm=azure_model,
    embeddings=azure_embeddings,
    token_usage_parser=get_token_usage_for_openai,
)

total_tokens = result_2.total_tokens()
```
This will give you the total tokens consumed during the evaluation process. The `get_token_usage_for_openai` function parses the token usage from the LLM result, ensuring accurate token counting [1][2].
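Once you have the token counts, you can turn them into a cost estimate. Below is a minimal sketch, assuming a `TokenUsage`-like structure with `input_tokens` and `output_tokens` fields (mirroring what `total_tokens()` returns); the per-token prices are placeholders, not real Azure OpenAI pricing:

```python
from dataclasses import dataclass

# Local stand-in for the token-usage object returned by result.total_tokens();
# the field names here are an assumption for illustration.
@dataclass
class TokenUsage:
    input_tokens: int
    output_tokens: int

def estimate_cost(usage: TokenUsage,
                  cost_per_input_token: float,
                  cost_per_output_token: float) -> float:
    """Estimate spend from token counts and per-token prices."""
    return (usage.input_tokens * cost_per_input_token
            + usage.output_tokens * cost_per_output_token)

# Example with placeholder prices; substitute your deployment's actual rates.
usage = TokenUsage(input_tokens=50_000, output_tokens=10_000)
print(estimate_cost(usage, 5e-6, 15e-6))
```

Recent ragas versions also expose a `total_cost()` helper on the result object that takes per-token prices directly; check the ragas cost documentation for your installed version.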
[ ] I checked the documentation and related resources and couldn't find an answer to my question.
**Your Question**
How to calculate total token consumption for evaluating our RAG model? What parameter should I pass as `token_usage_parser` for Azure OpenAI?

**Code Examples**

```python
result_2 = evaluate(
    dataset1,
    metrics=metrics,
    llm=azure_model,
    embeddings=azure_embeddings,
    token_usage_parser=model_name,
)
```

**Additional context**
Anything else you want to share with us?