sananand007 opened 1 week ago
Well, the scheduler is likely not needed anymore, as the server picks up `config.toml` directly.

I was able to index the repository by intuition, but I don't think the answer engine is connected to the indexed data. How do you invoke a chat sequence with the indexed data?
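If the server really does pick up `config.toml` on its own, invoking it directly should cover both indexing and serving. The flags and model names below are illustrative assumptions, not taken from this issue; check `tabby serve --help` for the options available in your version:

```shell
# Assumes ~/.tabby/config.toml already lists the repositories to index.
# Model names and the --device flag are illustrative placeholders.
tabby serve \
  --model TabbyML/StarCoder-1B \
  --chat-model TabbyML/Mistral-7B \
  --device metal
```

Without a separate `tabby scheduler` run, the expectation is that the serve process itself schedules the indexing jobs described in `config.toml`.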
**Describe the bug**
I was looking for a way to index my local repository on an M1/M2 Mac. I have a `~/.tabby/config.toml` file configured for this.
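The actual contents of that file were not shown; a minimal sketch of a repository-indexing config, assuming the `[[repositories]]` schema, might look like this (the `git_url` is a placeholder, not the reporter's real path):

```toml
# Illustrative ~/.tabby/config.toml -- schema may differ between Tabby versions.
[[repositories]]
git_url = "file:///Users/me/projects/my-repo"
```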
My intention is to index the repository and then check whether the chat window in the server GUI can answer questions about the indexed data. I saw that indexing used to be possible with the `tabby scheduler` subcommand, which is unavailable right now. How can I get this done?

**Information about your version**
Please provide output of `tabby --version`
**Information about your GPU**
Please provide output of `nvidia-smi`
**Additional context**
Add any other context about the problem here.