I noticed that after asking a question, it usually takes roughly 2 minutes to get an answer, so I was thinking of experimenting with other LLM inference providers, such as Groq, to make the output real-time.

Is this project still active, and how can I contribute to it with other open-source models?

I have set up the project on Google Colab: ![image](https://github.com/hyperledger-labs/aifaq/assets/98694380/d3de356a-082a-4abc-b2ad-4f1a4397dba4)

It would be great if you could guide me through the process @gcapuzzi