javiertrigueros opened this issue 3 weeks ago
@javiertrigueros, hello! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this challenge together!
The issue you're encountering with not being able to talk to a composite brain is not likely due to a CORS error. Instead, it seems to be related to `gpt_answer_generator` being `None` in the `create_stream_question_handler` function.
Here are the possible reasons for this:
- `gpt_answer_generator` might not have been properly initialized due to missing or incorrect parameters during its instantiation.
- If `gpt_answer_generator` relies on external services or dependencies that are not available or correctly configured, it could result in `None`.
- A misconfiguration could prevent `gpt_answer_generator` from being set up correctly.
- If an external service that `gpt_answer_generator` depends on is down or unreachable, it could result in `None`.
- A bug in the setup code could prevent `gpt_answer_generator` from being assigned a valid instance [1].

Additionally, ensure that the `user_id` is provided and that there are connected brains. The `CompositeBrainQA` class requires a `user_id` during initialization, and if no connected brains are found, it defaults to using `HeadlessQA` [2].
Check these aspects in your setup to resolve the issue.
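As a quick way to narrow down which of these cases applies, a guard can log the reason instead of letting the `None` propagate into an `AttributeError` later. This is a minimal, self-contained sketch — the function below is a stand-in for the real `get_answer_generator`, not Quivr's implementation:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def get_answer_generator(user_id, connected_brains_ids):
    """Hypothetical stand-in: returns None when prerequisites are missing,
    mirroring the failure mode described above."""
    if user_id is None:
        logger.warning("user_id missing: generator not created")
        return None
    if not connected_brains_ids:
        logger.warning("no connected brains: generator not created")
        return None
    return object()  # placeholder for a real CompositeBrainQA instance


gen = get_answer_generator(user_id="user-1", connected_brains_ids=None)
if gen is None:
    # Fail fast with a clear message instead of an AttributeError downstream.
    print("generator is None: check user_id and connected brains")
```

Logging at the point where each prerequisite fails makes the eventual 500 error traceable to a specific cause.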
To continue talking to Dosu, mention @dosu.
@dosu I have tried to create a "composite brain" from scratch. As the model I have set "gpt-3.5-turbo-0125", but with the same result. Is that parameter correct? What are the possible models?
The model parameter "gpt-3.5-turbo-0125" is correct and supported.
To resolve the `AttributeError: 'NoneType' object has no attribute 'generate_stream'` in the `create_stream_question_handler` function, ensure that the `knowledge_qa` attribute is properly initialized and not `None`. The possible models you can use are those supported by the `QuivrRAG` class, which include "gpt-3.5-turbo-0125" and the other models listed in the `model_compatible_with_function_calling` method.
Here is the relevant code snippet for initializing `knowledge_qa`:
```python
self.knowledge_qa = QuivrRAG(
    model=self.brain.model if self.brain.model else self.model,
    brain_id=brain_id,
    chat_id=chat_id,
    streaming=streaming,
    max_input=self.max_input,
    max_tokens=self.max_tokens,
    **kwargs,
)  # type: ignore
```
Ensure that `self.brain.model` or `self.model` is set to a valid model name like "gpt-3.5-turbo-0125". The supported models are those listed in the `model_compatible_with_function_calling` method.
Additionally, ensure that the `generate_stream` method is properly defined in the `HeadlessQA` class [1][2][3].
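The model-fallback expression in the snippet above (`self.brain.model if self.brain.model else self.model`) can be isolated and checked on its own. The following is an illustrative sketch — the function name and error message are mine, not Quivr's API:

```python
def resolve_model(brain_model, default_model):
    """Mirrors `self.brain.model if self.brain.model else self.model`."""
    model = brain_model if brain_model else default_model
    if model is None:
        # Neither the brain nor the chat handler has a model configured,
        # which would leave QuivrRAG without a valid model name.
        raise ValueError("No model configured for this brain or chat")
    return model


print(resolve_model(None, "gpt-3.5-turbo-0125"))  # falls back to the default
```

If both values are unset, failing loudly here is preferable to letting `None` flow into the RAG pipeline.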
@dosu I have debugged the code and these are the values that the `brain` variable takes before entering:

```python
gpt_answer_generator = chat_instance.get_answer_generator(
    brain=brain,
    chat_id=str(chat_id),
    model=brain.model,
    temperature=0.1,
    streaming=True,
    prompt_id=chat_question.prompt_id,
    user_id=current_user.id,
    user_email=current_user.email,
)
```

They are:

```
backend-core | [WARNING] modules.chat.controller.chat_routes [chat_routes.py:84]: brain
backend-core | [WARNING] modules.chat.controller.chat_routes [chat_routes.py:85]: brain_id=UUID('46e4f9ea-744e-4302-93fc-c1b2aac371a1') name='composite brain' description='This is a description of composite brain ' temperature=0.0 model='gpt-4o' max_tokens=2000 status='private' prompt_id=None last_update=datetime.datetime(2024, 6, 13, 8, 23, 16, 994515) brain_type=<BrainType.COMPOSITE: ' composite'> brain_definition=None connected_brains_ids=None raw=None jq_instructions=None integration=None integration_description=None
backend-core | INFO: 172.25.0.1:33806 - "POST /chat/7e81d554-8c58-4d28-9605-29e7a796d484/question/stream?brain_id=46e4f9ea-744e-4302-93fc-c1b2aac371a1 HTTP/1.1" 500 Internal Server Error
```
The "doc" brains and integrations work correctly.
The issue seems to be that the `connected_brains_ids` attribute of the `brain` object is `None`. For a composite brain, this attribute should contain the IDs of the connected brains. If it is `None`, the `CompositeBrainQA` class might not be able to generate a stream correctly.
Here are a few steps to resolve this:
1. **Ensure `connected_brains_ids` is set:** Make sure that the `connected_brains_ids` attribute is properly set when creating a composite brain. This attribute should contain a list of IDs of the brains that are part of the composite brain.
2. **Check the `get_answer_generator` implementation:** Verify that the `get_answer_generator` method in the `BrainfulChat` class correctly handles composite brains. It should return an instance of `CompositeBrainQA` when the brain type is `BrainType.COMPOSITE`.
3. **Update `create_stream_question_handler`:** Ensure that the `create_stream_question_handler` function checks for the presence of connected brains and handles the case where `connected_brains_ids` is `None`.
Here is an example of how you might update the `create_stream_question_handler` function to handle this:
```python
@chat_router.post(
    "/chat/{chat_id}/question/stream",
    dependencies=[
        Depends(
            AuthBearer(),
        ),
    ],
    tags=["Chat"],
)
async def create_stream_question_handler(
    request: Request,
    chat_question: ChatQuestion,
    chat_id: UUID,
    brain_id: Annotated[UUID | None, Query()] = None,
    current_user: UserIdentity = Depends(get_current_user),
) -> StreamingResponse:
    chat_instance = BrainfulChat()
    chat_instance.validate_authorization(user_id=current_user.id, brain_id=brain_id)
    logger.info(
        f"Creating question for chat {chat_id} with brain {brain_id} of type {type(brain_id)}"
    )
    brain = brain_service.get_brain_by_id(brain_id)
    if brain.brain_type == BrainType.COMPOSITE and not brain.connected_brains_ids:
        raise HTTPException(
            status_code=400, detail="Composite brain must have connected brains"
        )
    gpt_answer_generator = chat_instance.get_answer_generator(
        brain=brain,
        chat_id=str(chat_id),
        model=brain.model,
        temperature=0.1,
        streaming=True,
        prompt_id=chat_question.prompt_id,
        user_id=current_user.id,
        user_email=current_user.email,
    )
    maybe_send_telemetry("question_asked", {"streaming": True}, request)
    if gpt_answer_generator is None:
        raise HTTPException(
            status_code=500, detail="Failed to generate answer generator"
        )
    try:
        return StreamingResponse(
            gpt_answer_generator.generate_stream(
                chat_id, chat_question, save_answer=True
            ),
            media_type="text/event-stream",
        )
    except HTTPException as e:
        raise e
```
This code checks if `connected_brains_ids` is `None` for a composite brain and raises an `HTTPException` if it is, ensuring that the composite brain is correctly configured before attempting to generate a stream [1].
Additionally, ensure that the `CompositeBrainQA` class is correctly implemented to handle the generation of answers for composite brains, including the use of connected brains [2].
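The pre-flight guard suggested above can be exercised in isolation. Below is a self-contained sketch with a reduced `Brain` stand-in — the real entity has many more fields, as the debug log earlier in this thread shows:

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional
from uuid import UUID


class BrainType(str, Enum):
    DOC = "doc"
    COMPOSITE = "composite"


@dataclass
class Brain:
    # Reduced stand-in for the real brain entity.
    brain_type: BrainType
    connected_brains_ids: Optional[List[UUID]] = None


def validate_composite(brain: Brain) -> None:
    # Mirrors the route-level guard: a composite brain without
    # connected brains cannot answer questions.
    if brain.brain_type == BrainType.COMPOSITE and not brain.connected_brains_ids:
        raise ValueError("Composite brain must have connected brains")


broken = Brain(brain_type=BrainType.COMPOSITE, connected_brains_ids=None)
try:
    validate_composite(broken)
except ValueError as e:
    print(e)  # same shape as the debugged brain: connected_brains_ids=None
```

Running this against the values from the debug log reproduces the rejection, which confirms the misconfiguration is in brain creation, not in the streaming path.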
@dosu Could you give me an example of how to create "composite brains" with the API? I have created several, but I think I am missing something since they do not work.
To create composite brains with the API, you need to ensure that the `connected_brains_ids` attribute is properly set. Here is a detailed example of how to do this using the `create_brain_composite` method in the `BrainService` class:
```python
from uuid import UUID

from modules.brain.dto.inputs import CreateBrainProperties
from modules.brain.entity.brain_entity import BrainType
from modules.brain.service.brain_service import BrainService
from modules.brain.entity.composite_brain_connection_entity import CompositeBrainConnectionEntity

# Initialize the BrainService
brain_service = BrainService()

# Define the properties for the new composite brain
new_composite_brain_properties = CreateBrainProperties(
    brain_type=BrainType.COMPOSITE,
    connected_brains_ids=[
        UUID("123e4567-e89b-12d3-a456-426614174000"),  # Example connected brain ID 1
        UUID("123e4567-e89b-12d3-a456-426614174001"),  # Example connected brain ID 2
    ],
)

# Create the composite brain
created_composite_brain = brain_service.create_brain_composite(new_composite_brain_properties)

# Connect the brains
for connected_brain_id in new_composite_brain_properties.connected_brains_ids:
    connection = CompositeBrainConnectionEntity(
        composite_brain_id=created_composite_brain.id,
        connected_brain_id=connected_brain_id,
    )
    brain_service.connect_brain(connection)

# Output the created composite brain details
print(created_composite_brain)
```
In this example:
- A `CreateBrainProperties` object is initialized with `brain_type` set to `BrainType.COMPOSITE` and the `connected_brains_ids` attribute set to a list of UUIDs representing the connected brains.
- The `create_brain_composite` method is called to create the composite brain.
- The `CompositeBrainConnectionEntity` class is used to create connections for each brain by calling the `connect_brain` method with the appropriate `composite_brain_id` and `connected_brain_id`.

Ensure that the `connected_brains_ids` attribute is correctly set with valid UUIDs of the brains you want to connect to the composite brain [1][2][3].
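If you are creating the brain over HTTP rather than from Python, the request body needs the same two fields. The sketch below only builds the JSON payload; the endpoint path and the exact set of accepted fields are assumptions on my part, so verify them against your running backend's OpenAPI docs:

```python
import json

# Assumed endpoint (verify against your backend): POST /brains/
payload = {
    "name": "composite brain",
    "brain_type": "composite",  # must be the composite type, not "doc"
    "connected_brains_ids": [
        "123e4567-e89b-12d3-a456-426614174000",  # example connected brain ID 1
        "123e4567-e89b-12d3-a456-426614174001",  # example connected brain ID 2
    ],
}

body = json.dumps(payload)
print(body)
```

With a client such as `requests`, this body would be POSTed with the usual `Authorization: Bearer <token>` header. The key point is that `connected_brains_ids` must be present and non-empty; otherwise the brain row is created without matching rows in `composite_brain_connections`.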
@dosu When I create a "composite brain" from the backend it gives me this error. In the database it creates the brain but not the references in the composite_brain_connections table. The error is "AttributeError: 'BrainService' object has no attribute 'composite_brains_connections_repository'", which looks like the reason my "composite brain" is not working normally.
Note: I did create the references for my "composite brain" manually in the database, adding rows to the brains_users and composite_brain_connections tables. That doesn't work either.
```
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 93, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 148, in simple_response
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/code/modules/brain/controller/brain_routes.py", line 106, in create_new_brain
    new_brain = brain_service.create_brain(
  File "/code/modules/brain/service/brain_service.py", line 133, in create_brain
    return self.create_brain_composite(brain)
  File "/code/modules/brain/service/brain_service.py", line 174, in create_brain_composite
    self.composite_brains_connections_repository.connect_brain(
AttributeError: 'BrainService' object has no attribute 'composite_brains_connections_repository'
```
To resolve the issue where the `composite_brains_connections_repository` attribute is missing in the `BrainService` class, you need to ensure that this attribute is properly initialized. Here is how you can do it:

1. **Import the `CompositeBrainsConnections` class:** Ensure that the `CompositeBrainsConnections` class is imported in the `brain_service.py` file.
2. **Initialize the `composite_brains_connections_repository` attribute:** Add the initialization of the `composite_brains_connections_repository` attribute in the `BrainService` class constructor.

Here is an example of how you can update the `BrainService` class:
```python
from modules.brain.repository.composite_brains_connections import CompositeBrainsConnections


class BrainService:
    brain_repository: BrainsInterface
    brain_user_repository: BrainsUsersInterface
    brain_vector_repository: BrainsVectorsInterface
    external_api_secrets_repository: ExternalApiSecretsInterface
    integration_brains_repository: IntegrationBrainInterface
    integration_description_repository: IntegrationDescriptionInterface
    composite_brains_connections_repository: CompositeBrainsConnections

    def __init__(self):
        self.brain_repository = Brains()
        self.brain_user_repository = BrainsUsers()
        self.brain_vector_repository = BrainsVectors()
        self.external_api_secrets_repository = ExternalApiSecrets()
        self.integration_brains_repository = IntegrationBrain()
        self.integration_description_repository = IntegrationDescription()
        self.composite_brains_connections_repository = CompositeBrainsConnections()

    # other methods
```
By ensuring that the `composite_brains_connections_repository` attribute is properly initialized, the `create_brain_composite` method should be able to use it without encountering the `AttributeError` [1][2].
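The root cause is worth noting in general terms: a class-level annotation like `composite_brains_connections_repository: CompositeBrainsConnections` only declares a type for static checkers; it does not create the attribute at runtime. A minimal reproduction, independent of Quivr:

```python
class Service:
    # An annotation alone does NOT create the attribute at runtime...
    repository: dict

    def __init__(self, init_repo: bool):
        if init_repo:
            # ...only an actual assignment in __init__ does.
            self.repository = {}


broken = Service(init_repo=False)
try:
    broken.repository
except AttributeError as e:
    print(e)  # same failure mode as the BrainService traceback
```

This is why the traceback points at `create_brain_composite`: the annotation made the code type-check, but the constructor never assigned the repository, so the first runtime access raised `AttributeError`.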
What happened?
Thank you very much for your great project.
The entire application works correctly and I can talk to the different brains, but when I create a composite brain I can't talk to it. I suspect a possible CORS error or something similar, but I'm not sure.
Relevant log output
Twitter / LinkedIn details
No response