Open coffeebe4code opened 4 months ago
Hello, source code context has not yet been enabled in the Answer Engine (the homepage search feature), but we expect to make it available very soon (e.g., in 0.14).
While my question was directed at the Answer Engine, it's good to hear that support will be added soon! Thank you very much for your quick reply. My intent was not exclusive to the Answer Engine, though.
I was unclear when I said "How do I configure a chat instance or something", so I am sorry about that!
I am interested in how I can get it working right now. If there is another way to run or provide the vectorized repository context to a query, whether through the Vim extension or just the server API, please let me know.
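In the meantime, this is roughly the kind of call I'm imagining against the server API. It is only a sketch; the exact chat route and payload on 0.13 are guesses on my part, and the /swagger-ui page on my instance is what I would actually check for the real path and schema:

```python
# Rough sketch, not a confirmed API: manually pasting file contents into a
# chat request, since retrieval from the index isn't attached automatically.
# The route below is a guess for 0.13 -- verify it against /swagger-ui.
import requests

TABBY_URL = "http://localhost:8080"
TOKEN = "token-from-the-web-ui"  # only needed if auth is enabled on the server

with open("path/to/file.py") as f:  # the file the question is about
    snippet = f.read()

resp = requests.post(
    f"{TABBY_URL}/v1beta/chat/completions",  # assumed route; verify locally
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "messages": [
            {
                "role": "user",
                "content": "Here is the file in question:\n\n" + snippet
                           + "\n\nList the first 5 lines of this file.",
            }
        ]
    },
)
# Depending on the version, the reply may come back as a server-sent event
# stream rather than a single JSON object.
print(resp.text)
```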
Downloaded Tabby 0.13, installed it, and ran it successfully on port 8080 with DeepseekCoder-6.7B and Mistral-7B.
Using CUDA.
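For reference, this is roughly how I double-check what the server has loaded (assuming the usual /v1/health route is present on this version):

```python
# Quick sanity check of what the running server reports (assumed /v1/health
# route): loaded model names, device, and version.
import requests

print(requests.get("http://localhost:8080/v1/health").json())
```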
When I ask the Tabby assistant on the webpage, or click "Ask Tabby" in the repository view with the file in question open, it seemingly has no access to the context.
When asked to list the first 5 lines of the file, it fails miserably.
How do I configure a chat instance (or something similar) to actually use the indexed repository?
The index does show success.