Closed: 912100012 closed this issue 3 weeks ago
To specify the `is_function_calling_model` property for a model like DASHSCOPE, you can create a custom LLM class that overrides the metadata. Here's how you can do it:

Create a Custom LLM Class:
Define a class that inherits from the appropriate base class and override the `metadata` property to set `is_function_calling_model` to `True`.
```python
from llama_index.core.llms import CustomLLM, LLMMetadata


class CustomDashscopeLLM(CustomLLM):
    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(
            is_function_calling_model=True,
            # Add other metadata properties as needed
        )
```
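Note that `CustomLLM` is abstract, so the subclass cannot be instantiated until it also implements `complete` and `stream_complete`. A minimal self-contained sketch, following the custom-LLM pattern from the LlamaIndex docs; the method bodies are placeholders, not a real DashScope integration:

```python
from typing import Any

from llama_index.core.llms import (
    CompletionResponse,
    CompletionResponseGen,
    CustomLLM,
    LLMMetadata,
)
from llama_index.core.llms.callbacks import llm_completion_callback


class CustomDashscopeLLM(CustomLLM):
    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(is_function_calling_model=True)

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
        # Placeholder body: a real implementation would call the DashScope API.
        return CompletionResponse(text="")

    @llm_completion_callback()
    def stream_complete(
        self, prompt: str, **kwargs: Any
    ) -> CompletionResponseGen:
        # Placeholder streaming body yielding a single empty chunk.
        yield CompletionResponse(text="", delta="")
```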
Use the Custom LLM:
Set your custom LLM in the `Settings` to ensure it is used throughout your application.
```python
from llama_index.core import Settings

Settings.llm = CustomDashscopeLLM()
```
This approach allows you to customize the metadata for your specific model, enabling function calling support [1].
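A quick way to confirm the flag is picked up, assuming the class sketched above:

```python
from llama_index.core import Settings

# Should print True once the metadata override is in place.
print(Settings.llm.metadata.is_function_calling_model)
```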
@912100012 the LLM needs to implement specific function-calling logic (i.e. subclassing `FunctionCallingLLM` and implementing a few new methods).
It is not as simple as flipping a metadata value.
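For context, a rough sketch of the shape that subclassing takes, assuming the `FunctionCallingLLM` interface in `llama_index.core.llms.function_calling`; the exact method names and signatures may differ between LlamaIndex versions, and the bodies here are placeholders:

```python
from typing import Any, List, Optional, Sequence

from llama_index.core.llms import ChatMessage, ChatResponse
from llama_index.core.llms.function_calling import FunctionCallingLLM
from llama_index.core.tools import BaseTool


class DashscopeFunctionCallingLLM(FunctionCallingLLM):
    """Hypothetical subclass wiring DashScope into the function-calling API."""

    def _prepare_chat_with_tools(
        self,
        tools: Sequence[BaseTool],
        user_msg: Optional[str] = None,
        chat_history: Optional[List[ChatMessage]] = None,
        verbose: bool = False,
        allow_parallel_tool_calls: bool = False,
        **kwargs: Any,
    ) -> dict:
        # Translate LlamaIndex tools into the function/tool schema the
        # DashScope chat endpoint expects, and assemble the request payload.
        raise NotImplementedError

    def get_tool_calls_from_response(
        self,
        response: ChatResponse,
        error_on_no_tool_call: bool = True,
        **kwargs: Any,
    ) -> List[Any]:
        # Extract tool calls from the raw DashScope response (typically
        # stashed in response.message.additional_kwargs).
        raise NotImplementedError
```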
Question
An LLM like DASHSCOPE has function-calling ability, but the llms package in LlamaIndex sets the `is_function_calling_model` metadata of all DASHSCOPE models to False by default, which produces the error "ValueError: Model name qwen-max does not support function calling API." How can I fix this, for example by setting the `is_function_calling_model` variable to True? (Note: `is_function_calling_model` is read-only, decorated with `@property`.)