binarynoise closed this 2 months ago
Thanks for the contribution!
There's currently another PR that's almost ready to be merged which moved the `getModelList()` logic elsewhere, so this sorting function will also have to be moved.
Let's put this on hold until #99 is merged. I'll let you know when it is.
Also, I'd like to add some test assertions for the sorting function.
I think the simplest implementation would be to add a new entry in `MOCK_API_TAGS_RESPONSE`, for example:

```ts
models: [
	{ ... },
	{
		name: "TheBloke/CodeLlama-70B-Python-AWQ",
		model: "TheBloke/CodeLlama-70B-Python-AWQ",
		modified_at: "2023-12-01T10:06:18.679361519-05:00",
		size: 4108928240,
		digest: "ca4cd4e8a562465d7cf0512fc4d4dff3407f07800b01c9a6ee9dd107001704b9",
		details: {
			parent_model: "",
			format: "gguf",
			family: "llama",
			families: null,
			parameter_size: "70B",
			quantization_level: "Q4_0"
		}
	}
]
```
And then in `test("displays model list and updates settings store")` we should expect the models to be sorted in the correct order:

```ts
// Check if the model list contains the expected models
await expect(modelSelect).toContainText(MOCK_API_TAGS_RESPONSE.models[0].name);
await expect(modelSelect).toContainText(MOCK_API_TAGS_RESPONSE.models[1].name);
await expect(modelSelect).toContainText(MOCK_API_TAGS_RESPONSE.models[2].name);

await modelSelect.selectOption(MOCK_API_TAGS_RESPONSE.models[1].name);

// Check if the settings store is updated with the selected model
const localStorageValue = await page.evaluate(() => window.localStorage.getItem('hollama-settings'));
if (!localStorageValue) throw new Error('No local storage value');
const parsedLocalStorageValue = JSON.parse(localStorageValue);
expect(parsedLocalStorageValue.ollamaModel).toBe(MOCK_API_TAGS_RESPONSE.models[1].name);

// Check that the models are sorted alphabetically (excluding 3rd party repositories)
expect(parsedLocalStorageValue.ollamaModels[0].name).toBe(MOCK_API_TAGS_RESPONSE.models[2].name);
expect(parsedLocalStorageValue.ollamaModels[1].name).toBe(MOCK_API_TAGS_RESPONSE.models[0].name);
expect(parsedLocalStorageValue.ollamaModels[2].name).toBe(MOCK_API_TAGS_RESPONSE.models[1].name);
```
:+1: I'll take another look when the other PR is merged. I'd defer the testing until then too. Luckily the TS isn't too crazy, so writing tests will probably work out for me.
For me, having the patch locally already is a great help (I currently have about 40 models lying around waiting to be decided on whether they work for me or don't).
@binarynoise alright, I think you can proceed 👍
The best place to sort the models now would be in `ollamaTags()`.

Also, any chance we can avoid using the non-null `!`'s? As a reference, we have a similar sorting function that uses some guard clauses to avoid `!`'s.
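For illustration, here's a rough sketch of what a guard-clause comparator could look like. The `OllamaModel` shape, the `sortModels` name, and the "third-party repositories sort last" rule (detected by a `/` in the name) are assumptions for this example, not hollama's actual code:

```typescript
// Hypothetical model shape; only `name` matters for sorting here.
interface OllamaModel {
	name: string;
}

function sortModels(models: OllamaModel[]): OllamaModel[] {
	return [...models].sort((a, b) => {
		// Guard clauses instead of non-null assertions (`!`):
		// missing names sink to the end without ever dereferencing undefined.
		if (!a?.name) return 1;
		if (!b?.name) return -1;

		// Assumed rule: models from 3rd-party repositories (name contains "/")
		// are grouped after plain model names.
		const aIsThirdParty = a.name.includes('/');
		const bIsThirdParty = b.name.includes('/');
		if (aIsThirdParty !== bIsThirdParty) return aIsThirdParty ? 1 : -1;

		return a.name.localeCompare(b.name);
	});
}

const sorted = sortModels([
	{ name: 'TheBloke/CodeLlama-70B-Python-AWQ' },
	{ name: 'mistral' },
	{ name: 'gemma2' }
]);
console.log(sorted.map((m) => m.name));
// → ['gemma2', 'mistral', 'TheBloke/CodeLlama-70B-Python-AWQ']
```

The comparator never needs `!` because every branch that touches `.name` is behind a truthiness check.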
I reimplemented the sort. I decided not to split the repo prefix off, because I don't do that in a private project either. That way, we don't have that `!` problem any more.
@binarynoise thank you, this looks great!
Any chance you can update the test? I think you can mostly copy/paste my suggestion here: https://github.com/fmaclen/hollama/pull/120#issuecomment-2254157126
Here you go, I added the tests.
:tada: This PR is included in version 0.9.4 :tada:
The release is available on the GitHub release page.
Your semantic-release bot :package::rocket:
@binarynoise thank you for the PR! Hoping to get #124 out soon.
Closes #104