aj2620 opened 1 month ago
Hello, @aj2620! I'm here to help you with any bugs, questions, or contributions you may have.
The issue of getting only a few letters of the response when using Langflow with Mistral AI could be related to the `max_tokens` parameter. Ensure that `max_tokens` is set to a sufficiently high value to allow a complete response. Additionally, check whether the `stream` parameter is set to `True`, which might affect how the response is processed and displayed.
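When streaming is enabled, each chunk carries only part of the text, so displaying each chunk in place of the previous one shows only the last few letters. The fix on the consumer side is to accumulate the chunks. A minimal sketch in plain Python (the chunk source here is simulated, not the Mistral API):

```python
def consume_stream(chunks):
    """Accumulate streamed text chunks instead of keeping only the latest one."""
    full_text = ""
    for chunk in chunks:
        full_text += chunk  # append each chunk; overwriting here would lose earlier text
    return full_text

simulated_stream = iter(["Hel", "lo, ", "wor", "ld!"])
print(consume_stream(simulated_stream))  # prints: Hello, world!
```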
Here is an example configuration for the `MistralAIModelComponent`:
```python
from typing import Optional

from langchain_mistralai import ChatMistralAI
from pydantic.v1 import SecretStr

from langflow.custom import CustomComponent
from langflow.field_typing import BaseLanguageModel


class MistralAIModelComponent(CustomComponent):
    display_name: str = "MistralAI"
    description: str = "Generate text using MistralAI LLMs."
    icon = "MistralAI"
    field_order = [
        "model",
        "mistral_api_key",
        "max_tokens",
        "temperature",
        "mistral_api_base",
    ]

    def build_config(self):
        return {
            "model": {
                "display_name": "Model Name",
                "options": [
                    "open-mistral-7b",
                    "open-mixtral-8x7b",
                    "open-mixtral-8x22b",
                    "mistral-small-latest",
                    "mistral-medium-latest",
                    "mistral-large-latest",
                ],
                "info": "Name of the model to use.",
                "required": True,
                "value": "open-mistral-7b",
            },
            "mistral_api_key": {
                "display_name": "Mistral API Key",
                "required": True,
                "password": True,
                "info": "Your Mistral API key.",
            },
            "max_tokens": {
                "display_name": "Max Tokens",
                "field_type": "int",
                "advanced": True,
                "value": 256,
            },
            "temperature": {
                "display_name": "Temperature",
                "field_type": "float",
                "value": 0.1,
            },
            "mistral_api_base": {
                "display_name": "Mistral API Base",
                "advanced": True,
                "info": "Endpoint of the Mistral API. Defaults to 'https://api.mistral.ai' if not specified.",
            },
            "code": {"show": False},
        }

    def build(
        self,
        model: str,
        temperature: float = 0.1,
        mistral_api_key: Optional[str] = None,
        max_tokens: Optional[int] = None,
        mistral_api_base: Optional[str] = None,
    ) -> BaseLanguageModel:
        # Set default API endpoint if not provided
        if not mistral_api_base:
            mistral_api_base = "https://api.mistral.ai"
        try:
            output = ChatMistralAI(
                model_name=model,
                api_key=SecretStr(mistral_api_key) if mistral_api_key else None,
                max_tokens=max_tokens or None,
                temperature=temperature,
                endpoint=mistral_api_base,
            )
        except Exception as e:
            raise ValueError("Could not connect to Mistral API.") from e
        return output
```
For the validation issues in the `TransactionModel`, ensure that the `target_args` field is correctly validated and parsed as JSON if it is a string. Here is the relevant part of the `TransactionModel` with the `target_args` validator:
```python
import json
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, Field, validator


class TransactionModel(BaseModel):
    id: Optional[int] = Field(default=None, alias="id")
    timestamp: Optional[datetime] = Field(default_factory=datetime.now, alias="timestamp")
    source: str
    target: str
    target_args: dict
    status: str
    error: Optional[str] = None

    class Config:
        from_attributes = True
        populate_by_name = True

    # Validate target_args in case it arrives as a JSON string
    @validator("target_args", pre=True)
    def validate_target_args(cls, v):
        if isinstance(v, str):
            return json.loads(v)
        return v
```
This validator ensures that if `target_args` is provided as a JSON string, it will be parsed into a dictionary; if `target_args` is already a dictionary, it will be returned as is. This should resolve any validation issues related to the `target_args` field in the `TransactionModel` [1][2][3][4].
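The parse-if-string pattern the validator uses can be illustrated outside pydantic with plain `json` (a minimal sketch; the function name is illustrative, not Langflow code):

```python
import json

def coerce_target_args(value):
    """Return a dict: parse JSON strings, pass dicts through unchanged."""
    if isinstance(value, str):
        return json.loads(value)  # raises json.JSONDecodeError on malformed input
    return value

print(coerce_target_args('{"path": "/tmp"}'))  # parsed into a dict
print(coerce_target_args({"path": "/tmp"}))    # returned unchanged
```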
To continue talking to Dosu, mention @dosu.
@dosu I tried increasing the `max_tokens` value to 1000, and now I'm getting a new error.
To resolve the `ValidationError` indicating that the `sender_name` field in `ChatOutputResponse` should be a valid string but a generator object was provided instead, you need to ensure that `sender_name` is assigned a string value and not a generator object.

Here's an example of how you might be incorrectly assigning a generator to `sender_name`:
```python
response = ChatOutputResponse(
    message="Hello, world!",
    sender="User",
    sender_name=(name for name in ["AI"]),  # This is a generator, which is incorrect
)
```
To fix this, you should assign a string directly to `sender_name`:
```python
response = ChatOutputResponse(
    message="Hello, world!",
    sender="User",
    sender_name="AI",  # This is a valid string
)
```
Ensure that wherever you create an instance of `ChatOutputResponse`, `sender_name` is a string and not a generator or any other non-string type. The `ChatOutputResponse` class, as defined, expects `sender_name` to be an `Optional[str]` with a default value of `"AI"`, so providing a string directly will resolve the validation error [1][2].
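If the value originates from a lazy source that yields text pieces, one way to get a plain string is to join the pieces before constructing the response object. A minimal sketch (the generator here is illustrative, not the Langflow API):

```python
def materialize(chunks):
    """Collapse an iterator of text chunks into a single string."""
    return "".join(chunks)

name_gen = (part for part in ["A", "I"])  # a generator, invalid as sender_name
sender_name = materialize(name_gen)       # a plain string, valid as sender_name
print(sender_name)  # prints: AI
```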
It looks like you selected the wrong field in the `ChatOutput` component. Could you please provide a screenshot of your flow?
**Describe the bug**
I have tried Langflow with Mistral AI, and in the output I'm getting only a few letters of the response.

**Browser and Version**
Google Chrome, Version 125.0.6422.142

**To Reproduce**
Steps to reproduce the behavior:

**Screenshots**

**Additional context**
I have also attached a screenshot of the error showing in the terminal. Everything is working fine, but the output is not being generated completely.