biocypher / biochatter

Backend library for conversational AI in biomedicine
http://biochatter.org/
MIT License

Test mocking fails for Xinference Client #103

Closed: slobentanzer closed this issue 9 months ago

slobentanzer commented 9 months ago

@fengsh27 on the current PR #102, CI fails because Client is not found in the llm_connect.py module. If I remember correctly, you moved the import into the XinferenceConversation constructor to keep the server lightweight, right?

https://github.com/biocypher/biochatter/blob/2982143010e8e268207fa6dc8af8bb50ecfdd108/biochatter/llm_connect.py#L412

Is this still necessary? If so, we need to think about how to adjust the test so that it does not fail; if not, we should rather move the Client import back to the top of the module.

AttributeError: <module 'biochatter.llm_connect' from '/home/runner/work/biochatter/biochatter/biochatter/llm_connect.py'> does not have the attribute 'Client'
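
For reference, the deferred import in question looks roughly like this (a sketch from memory; attribute names are illustrative, not the exact code in llm_connect.py):

class XinferenceConversation:
    def __init__(self, base_url: str, prompts: dict, correct: bool = True):
        # Deferred import: xinference is only needed when the class is actually
        # instantiated, so Client is never bound as a module-level attribute of
        # biochatter.llm_connect.
        from xinference.client import Client

        self.client = Client(base_url=base_url)
        self.models = self.client.list_models()
        self.prompts = prompts
        self.correct = correct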
fengsh27 commented 9 months ago

@slobentanzer Yes, we moved the xinference import into the constructor, so Client is never bound as an attribute of biochatter.llm_connect, which is why patching it there raises the AttributeError. I imported xinference.client.Client at the top of test_llm_connect.py, and we need to mock "xinference.client.Client" instead:

import pytest
from unittest.mock import patch

from biochatter.llm_connect import XinferenceConversation

# xinference_models: the mocked model listing defined earlier in test_llm_connect.py


@pytest.fixture
def xinference_conversation():
    # Patch Client in the module where it is defined, because llm_connect only
    # imports it lazily inside the XinferenceConversation constructor.
    with patch("xinference.client.Client") as mock_client:
        mock_client.return_value.list_models.return_value = xinference_models
        # Return a minimal chat response: one message plus token usage.
        mock_client.return_value.get_model.return_value.chat.return_value = (
            {"choices": [{"message": {"content": "Human message"}}]},
            {"completion_tokens": 0},
        )
        conversation = XinferenceConversation(
            base_url="http://llm.biocypher.org",
            prompts={},
            correct=False,
        )
    return conversation

See https://github.com/biocypher/biochatter/blob/2982143010e8e268207fa6dc8af8bb50ecfdd108/test/test_llm_connect.py#L172
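
To make the "patch where it is looked up" point explicit: the two targets behave differently precisely because of the deferred import. A minimal check (assumes xinference is installed in the test environment, as it is in CI):

import pytest
from unittest.mock import patch


def test_patch_target_matters():
    # Client is imported lazily inside the XinferenceConversation constructor,
    # so it is never an attribute of biochatter.llm_connect and patch() cannot
    # find it there; this reproduces the CI error.
    with pytest.raises(AttributeError):
        with patch("biochatter.llm_connect.Client"):
            pass

    # Patching the class in the module where it is defined works as expected.
    with patch("xinference.client.Client"):
        pass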

slobentanzer commented 9 months ago

makes sense, thanks :)