Closed · slobentanzer closed this 9 months ago
@slobentanzer Since we moved `import xinference` into the constructor, I imported `xinference.client.Client` at the top of `test_llm_connect.py`. So we need to mock `"xinference.client.Client"` instead:
```python
@pytest.fixture
def xinference_conversation():
    # mock xinference at its own namespace, since llm_connect imports it lazily
    with patch("xinference.client.Client") as mock_client:
        mock_client.return_value.list_models.return_value = xinference_models
        mock_client.return_value.get_model.return_value.chat.return_value = (
            {"choices": [{"message": {"content": "Human message"}}]},
            {"completion_tokens": 0},
        )
        conversation = XinferenceConversation(
            base_url="http://llm.biocypher.org",
            prompts={},
            correct=False,
        )
        return conversation
```
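For background on why the patch target matters: `unittest.mock.patch` replaces a name in the namespace where it is *looked up at call time*, not where it is defined. A minimal, self-contained illustration of this rule, using the stdlib `json` module as a stand-in for `xinference` (hypothetical example, not from the PR):

```python
import json
from json import loads  # binds a module-local alias at import time
from unittest.mock import patch

def parse_via_module():
    # the name is resolved through the json module at call time,
    # so patching "json.loads" affects this call
    return json.loads('{"a": 1}')

def parse_via_alias():
    # `loads` was bound into this module's namespace at import time,
    # so patching "json.loads" does NOT affect this call
    return loads('{"a": 1}')

with patch("json.loads", return_value={"mocked": True}):
    print(parse_via_module())  # {'mocked': True}
    print(parse_via_alias())   # {'a': 1}
```

This is why moving the `Client` import into the constructor changes the required patch target: a lazy `from xinference.client import Client` resolves through `xinference.client` at call time, so patching `"xinference.client.Client"` works, whereas a top-level import in `llm_connect.py` would require patching the name in `llm_connect`'s namespace instead.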
makes sense, thanks :)
@fengsh27 on the current PR #102, CI fails because `Client` is not found in the `llm_connect.py` module. If I remember correctly, you moved the import into the `XinferenceConversation` constructor to keep the server lightweight, right? https://github.com/biocypher/biochatter/blob/2982143010e8e268207fa6dc8af8bb50ecfdd108/biochatter/llm_connect.py#L412
Is this still necessary? If so, we need to think about how to adjust the test so it does not fail; otherwise, we should move the `Client` import back to the top.
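For context, the trade-off under discussion looks roughly like this: keeping the import inside the constructor means importing the module never pulls in the heavy optional dependency, at the cost of tests having to patch the dependency's own namespace. A hedged sketch of the pattern, using the stdlib `sqlite3` as a stand-in since `xinference` may not be installed (hypothetical class, not the real `XinferenceConversation`):

```python
class LazyConnection:
    """Sketch of the lazy-import pattern, with sqlite3 standing in
    for an optional heavy dependency like xinference."""

    def __init__(self, path: str = ":memory:"):
        # Deferred import: importing this module costs nothing; the
        # dependency is only loaded when an instance is constructed.
        import sqlite3
        self.conn = sqlite3.connect(path)

    def ping(self) -> int:
        # trivial round-trip to show the connection is live
        return self.conn.execute("SELECT 1").fetchone()[0]
```

With this shape, a test would patch `"sqlite3.connect"` (the dependency's namespace), exactly analogous to patching `"xinference.client.Client"` in the fixture above.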