As of v0.1.33, Ollama supports request concurrency.
To reflect that, a **Concurrent Inferences** input was added to the settings panel, where users can set the number of concurrent calls sent to the server (currently capped at 5).
Due to an issue with the implementation of `atomWithLocalStorage`, however, this field is empty, instead of showing the default value of 1, until the user sets a value and submits the form.
**Help wanted:**
Due to time constraints, I have not been able to implement a version of Jotai's `atomWithLocalStorage` that recursively checks whether the nested keys of the `config` atom exist in localStorage, falling back to each key's default value instead of an empty value when the key is missing.
Any help implementing this would be much appreciated.
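One possible approach is a recursive merge of the stored value with the atom's defaults, so any nested key absent from localStorage resolves to its default. The sketch below is an untested suggestion, not code from this repository; `mergeWithDefaults`, `readStoredConfig`, and the config shape are hypothetical names used for illustration.

```typescript
// Hypothetical sketch: fill in nested keys missing from the stored
// value with their defaults, so a newly added setting (e.g. the
// concurrent-inferences count) shows its default instead of an empty field.
type PlainObject = Record<string, unknown>;

function isPlainObject(value: unknown): value is PlainObject {
  return typeof value === 'object' && value !== null && !Array.isArray(value);
}

// Walk the defaults: keep stored values where present (recursing into
// nested objects) and fall back to the default for any missing key.
function mergeWithDefaults<T extends PlainObject>(defaults: T, stored: unknown): T {
  if (!isPlainObject(stored)) return defaults;
  const result: PlainObject = { ...defaults };
  for (const key of Object.keys(defaults)) {
    if (!(key in stored)) continue; // missing key -> keep the default
    const defaultValue = defaults[key];
    result[key] = isPlainObject(defaultValue)
      ? mergeWithDefaults(defaultValue, stored[key])
      : stored[key];
  }
  return result as T;
}

// Read the persisted config and merge it with the defaults. This could
// back the initial value inside atomWithLocalStorage.
function readStoredConfig<T extends PlainObject>(key: string, defaults: T): T {
  if (typeof localStorage === 'undefined') return defaults;
  const raw = localStorage.getItem(key);
  if (raw === null) return defaults;
  try {
    return mergeWithDefaults(defaults, JSON.parse(raw));
  } catch {
    return defaults; // corrupted JSON -> fall back to defaults
  }
}
```

`readStoredConfig(key, defaults)` could replace the plain `localStorage.getItem` read when constructing the base atom, leaving the write path of `atomWithLocalStorage` unchanged.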