Feature Description
Currently, llms.watsonx does not support asynchronous processing, which is crucial for the efficient operation of evaluators such as FaithfulnessEvaluator and RelevancyEvaluator. By comparison, llms.ollama and llms.openai do support asynchronous processing, enabling smoother and faster evaluation within pipelines.
Reason
The following asynchronous methods currently raise NotImplementedError in llms.watsonx:
acomplete
astream_chat
achat
astream_complete
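As a rough illustration of one possible stopgap (not the actual llms.watsonx implementation; the class and method names below are simplified assumptions), the async variants could delegate to their synchronous counterparts on a worker thread so the event loop is not blocked:

```python
import asyncio


class WatsonxLike:
    """Illustrative stand-in for llms.watsonx (names are assumptions)."""

    def complete(self, prompt: str) -> str:
        # Synchronous completion; a stub standing in for the real API call.
        return f"completion for: {prompt}"

    async def acomplete(self, prompt: str) -> str:
        # Run the blocking sync call on a thread so other coroutines
        # (e.g. concurrent evaluator runs) can proceed meanwhile.
        return await asyncio.to_thread(self.complete, prompt)


print(asyncio.run(WatsonxLike().acomplete("hello")))
```

Thread delegation is only a workaround; a native async client would avoid tying up a thread per in-flight request, but it would already unblock evaluators that call the `a*` methods.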
Value of Feature
These evaluators are essential for ensuring the quality of generated responses; async support in llms.watsonx would let them run efficiently within evaluation pipelines.