dezoito / ollama-grid-search

A multi-platform desktop application to evaluate and compare LLM models, written in Rust and React.
MIT License

Limited Concurrency #29

Closed dezoito closed 6 months ago

dezoito commented 6 months ago

As of v0.1.33, Ollama supports request concurrency.

To take advantage of that, a "Concurrent Inferences" input was added to the settings panel, where users can set the number of concurrent calls sent to the server (currently capped at 5).
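
For illustration only, the sketch below shows one way such a cap can be enforced on the client side with a small worker pool; it is not the app's actual dispatch code (the real request handling may live elsewhere, e.g. on the Rust side), and the function name and task list are assumptions for the example.

```typescript
// Illustrative sketch: run a batch of tasks with at most `maxConcurrent`
// in flight at any time. Each task is a function returning a Promise,
// e.g. () => callInferenceEndpoint(params) (hypothetical).
async function runWithConcurrencyLimit<T>(
  tasks: (() => Promise<T>)[],
  maxConcurrent: number,
): Promise<T[]> {
  const results: T[] = new Array(tasks.length);
  let next = 0;

  // Each worker repeatedly pulls the next pending task until none remain.
  async function worker(): Promise<void> {
    while (next < tasks.length) {
      const index = next++;
      results[index] = await tasks[index]();
    }
  }

  // Start up to `maxConcurrent` workers and wait for all of them to finish.
  const workers = Array.from(
    { length: Math.min(Math.max(1, maxConcurrent), tasks.length) },
    () => worker(),
  );
  await Promise.all(workers);
  return results;
}
```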

However, due to an issue in the current atomWithLocalStorage implementation, this field appears empty, instead of showing the default value of 1, until the user sets a value and saves the form.

Help wanted: due to time constraints, I have not been able to implement a version of Jotai's atomWithLocalStorage that recursively checks whether the nested keys of the config atom exist in localStorage and returns their default values when they do not, instead of leaving them empty.

Any help implementing that would be much appreciated.
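
One possible approach (a sketch under assumptions, not a tested implementation of this repo's atom) is to deep-merge the stored JSON with the atom's defaults on read, so any nested key missing from localStorage falls back to its default. The helper names below (mergeWithDefaults, this atomWithLocalStorage signature) are illustrative.

```typescript
import { atom } from "jotai";

// Recursively fill in any keys missing from the stored value with the
// corresponding defaults, so newly added settings (e.g. a concurrency field)
// fall back to their default instead of being undefined.
function mergeWithDefaults<T>(defaults: T, stored: unknown): T {
  if (
    typeof defaults !== "object" ||
    defaults === null ||
    Array.isArray(defaults) ||
    typeof stored !== "object" ||
    stored === null
  ) {
    // Primitives, arrays, or missing stored values: prefer the stored value,
    // falling back to the default when nothing was stored.
    return (stored ?? defaults) as T;
  }
  const result: Record<string, unknown> = { ...defaults };
  for (const key of Object.keys(defaults as Record<string, unknown>)) {
    result[key] = mergeWithDefaults(
      (defaults as Record<string, unknown>)[key],
      (stored as Record<string, unknown>)[key],
    );
  }
  return result as T;
}

// Jotai atom backed by localStorage: reads merge the stored JSON with the
// defaults; writes persist the new value back to localStorage.
function atomWithLocalStorage<T>(key: string, defaultValue: T) {
  const getInitialValue = (): T => {
    const item = localStorage.getItem(key);
    if (item === null) return defaultValue;
    try {
      return mergeWithDefaults(defaultValue, JSON.parse(item));
    } catch {
      return defaultValue;
    }
  };
  const baseAtom = atom(getInitialValue());
  return atom(
    (get) => get(baseAtom),
    (get, set, update: T | ((prev: T) => T)) => {
      const next =
        typeof update === "function"
          ? (update as (prev: T) => T)(get(baseAtom))
          : update;
      set(baseAtom, next);
      localStorage.setItem(key, JSON.stringify(next));
    },
  );
}
```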

dezoito commented 6 months ago

Added in v0.5.0