chiragn888 / video.js

MIT License

Sweep: Chat doesn't use indexed data in vespa when using Ollama (Llama 2) #2

Open chiragn888 opened 9 months ago

chiragn888 commented 9 months ago

Using Ollama (Llama 2) as the LLM, I am not able to use indexed data in Vespa. I can see that the document is indexed in Vespa by running the command

vespa query 'yql=select * from sources * where true'

However, asking a very simple question does not use this information.
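For reference, a minimal sketch of how the indexed data can be double-checked, assuming the Vespa CLI is configured against the default local endpoint (http://localhost:8080); the endpoint and the hit-count extraction are assumptions, not taken from the issue:

```shell
# List all indexed documents via the Vespa CLI
# (assumes `vespa config set target local` or equivalent has been run)
vespa query 'yql=select * from sources * where true'

# The same query over Vespa's HTTP search API; the totalCount field in
# the response shows how many documents matched
curl -s 'http://localhost:8080/search/' \
  --data-urlencode 'yql=select * from sources * where true'
```

If the query returns the expected documents but the chat answer still ignores them, the retrieval step between the LLM and Vespa, rather than the index itself, is the likely failure point.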

sweep-ai[bot] commented 9 months ago
Sweeping


❌ Unable to Complete PR

You have run out of free-tier GPT-4 tickets! We no longer support running Sweep with GPT-3.5, as it is too unreliable. Here are your options:


🎉 Latest improvements to Sweep:
  • New dashboard launched for real-time tracking of Sweep issues, covering all stages from search to coding.
  • Integration of OpenAI's latest Assistant API for more efficient and reliable code planning and editing, improving speed by 3x.
  • Use the GitHub issues extension for creating Sweep issues directly from your editor.

💡 To recreate the pull request, edit the issue title or description. To tweak the pull request, leave a comment on it.

This is an automated message generated by Sweep AI.