jackabald / TiDB-Hack-NL-repo-search

Semantic Search Engine for Code Repositories
Apache License 2.0

Update the Frontend #15

Closed jackabald closed 2 months ago

jackabald commented 3 months ago
  1. There should be a button or some other mechanism to index the repository before querying; the repo should not be re-indexed every time a query is made (see the sketch below for one possible approach).
  2. The frontend should allow for a more "chat-like" experience rather than a single question-and-answer.
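
One way to address point 1, assuming the frontend is Streamlit and the index is built with LlamaIndex, is to put the indexing step behind an explicit button and cache the result; the `build_index` helper and input fields below are illustrative placeholders, not the project's actual code:

```python
import streamlit as st
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Hypothetical helper: build the index once for a given repo checkout.
@st.cache_resource(show_spinner="Indexing repository...")
def build_index(repo_path: str) -> VectorStoreIndex:
    documents = SimpleDirectoryReader(repo_path, recursive=True).load_data()
    return VectorStoreIndex.from_documents(documents)

repo_path = st.text_input("Path to the cloned repository")

# Explicit indexing step: the repo is only (re)indexed when this button is pressed.
if st.button("Index repository") and repo_path:
    st.session_state["index"] = build_index(repo_path)
    st.success("Repository indexed.")

query = st.text_input("Ask a question about the code")
if query and "index" in st.session_state:
    response = st.session_state["index"].as_query_engine().query(query)
    st.write(str(response))
```
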
afzal442 commented 3 months ago

That's true. I observed that too. Will hack around this.

jackabald commented 2 months ago

https://docs.llamaindex.ai/en/latest/module_guides/deploying/query_engine/usage_pattern/

Information on how to get a "streaming response": this will reduce wait times by printing out words as the LLM generates them. Additionally, if we configure it right, it will make our frontend look more like a chatbot.
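
A minimal sketch of the streaming pattern from those docs, assuming `index` is the already-built LlamaIndex index:

```python
# Enable streaming so tokens are emitted as the LLM generates them.
query_engine = index.as_query_engine(streaming=True)

streaming_response = query_engine.query("Where is the repository indexed?")

# Print tokens as they arrive instead of waiting for the full answer.
for token in streaming_response.response_gen:
    print(token, end="", flush=True)
```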

https://docs.streamlit.io/develop/tutorials/llms/build-conversational-apps

This is a Streamlit tutorial on building conversational (chatbot-style) apps.
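
A minimal sketch of the chat loop from that tutorial, assuming the index built earlier is stored in `st.session_state`:

```python
import streamlit as st

# Keep the conversation history across Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay previous turns so the page reads like a chat transcript.
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

if prompt := st.chat_input("Ask about the repository"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # Assumed: the index built during the indexing step was stored under "index".
    index = st.session_state.get("index")
    if index is None:
        answer = "Please index a repository first."
    else:
        answer = str(index.as_query_engine().query(prompt))

    with st.chat_message("assistant"):
        st.markdown(answer)
    st.session_state.messages.append({"role": "assistant", "content": answer})
```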

afzal442 commented 2 months ago

Thanks, I will look into it by tomorrow.

afzal442 commented 2 months ago

The new UI now supports the chatbot feature, but the response is not formatted according to the query. Could you look into it and adjust the logic to match the project's goal?

[screenshot of the new chat UI]

jackabald commented 2 months ago

[Screenshot from 2024-08-10 20-58-11]

So it looks like it does have the functionality to return code snippets, but as you can see it clearly gets my first query wrong. I don't know if this is directly fixable; playing around with different LLMs could possibly change the results, but I'm not really sure. Do you have any ideas?
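
If we do experiment with different LLMs, LlamaIndex lets the model be swapped globally via Settings; a minimal sketch, assuming the llama-index-llms-openai integration is installed (any other integration, e.g. Ollama, can be dropped in the same way):

```python
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

# Swap the LLM used by every query engine built afterwards.
Settings.llm = OpenAI(model="gpt-4o-mini", temperature=0.1)
```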

afzal442 commented 2 months ago

Thanks for testing that out. That means it's working fine.

> I think playing around with different LLMs could possibly change results but I'm not really sure. Do you have any ideas?

I think you are right. I have ideas but let's see if time permits, we will do that.

afzal442 commented 2 months ago

Until then, you could explore giving users the option to choose between models such as DifyAI or LeptonAI.
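
One lightweight way to expose that choice would be a selectbox mapped to model backends; the backend entries, URLs, and model names below are placeholders, not real DifyAI/LeptonAI endpoints:

```python
import streamlit as st
from llama_index.core import Settings
from llama_index.llms.openai_like import OpenAILike

# Placeholder endpoints/models: swap in the real DifyAI / LeptonAI details.
BACKENDS = {
    "LeptonAI": {"api_base": "https://example-lepton-endpoint/api/v1", "model": "llama3-8b"},
    "DifyAI": {"api_base": "https://example-dify-endpoint/v1", "model": "default"},
}

choice = st.selectbox("Model backend", list(BACKENDS))
cfg = BACKENDS[choice]

# OpenAILike targets any OpenAI-compatible endpoint; use the vendor SDK instead
# if the chosen backend is not OpenAI-compatible.
Settings.llm = OpenAILike(model=cfg["model"], api_base=cfg["api_base"], api_key="placeholder")
```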

jackabald commented 2 months ago

I will explore the metadata filters and see if I can use them to improve results.
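
A minimal sketch of LlamaIndex metadata filters, assuming `index` is the built index and the indexed documents carry default metadata such as `file_name` (the key and value here are illustrative):

```python
from llama_index.core.vector_stores import ExactMatchFilter, MetadataFilters

# Restrict retrieval to nodes whose metadata matches the filter, e.g. one file.
filters = MetadataFilters(
    filters=[ExactMatchFilter(key="file_name", value="search.py")]
)

query_engine = index.as_query_engine(filters=filters)
response = query_engine.query("How is the repository indexed?")
```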

jackabald commented 2 months ago

https://www.youtube.com/watch?v=u5Vcrwpzoz8

Super interesting video related to this topic.

afzal442 commented 2 months ago

Completed