iQiexie closed this issue 2 months ago
Hm, the app by default uses a cloud inference provider called Groq that serves Llama-3.1-70b. There shouldn't be any significant hardware requirements just to call the API. Is there a different requirement you're referring to?
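To illustrate why no local hardware matters here: the inference is a plain HTTPS request to Groq's OpenAI-compatible endpoint. This is a minimal sketch (the exact model identifier and endpoint URL are assumptions, not taken from this repo's code):

```python
import json

# Assumed Groq OpenAI-compatible chat-completions endpoint.
GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_request(prompt: str, model: str = "llama-3.1-70b-versatile") -> str:
    """Build the JSON body for a chat-completion call.

    Sending it needs only an API key and a network connection --
    no local GPU or large amounts of RAM.
    """
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

body = build_request("Hello")
print(json.loads(body)["model"])
```

All the heavy lifting (running the 70B model) happens on Groq's servers, so the client machine only needs to serialize a small JSON payload.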
Oh, sorry. For some reason I thought it was a completely self-hosted project.
Please add the hardware requirements to the README.