Closed — ZainGithub12 closed this issue 3 months ago
Based on my experience, the 3.1 models are not yet stable on Groq. I'm delaying their inclusion for now, but I'll monitor their progress closely. Once they become more reliable, I plan to update and incorporate these new models.
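One lightweight way to handle this kind of staged rollout is to gate the newer model behind a flag and fall back to the current default. The sketch below is purely illustrative and is not OpenPerplex code; the helper name and flag are assumptions, though the two model IDs match Groq's published naming.

```python
# Hypothetical sketch: keep the current model as the default and only
# switch to the 3.1 model once it is judged stable on Groq.
STABLE_MODEL = "llama3-70b-8192"             # model currently in use
CANDIDATE_MODEL = "llama-3.1-70b-versatile"  # newer model, not yet enabled

def pick_model(llama_3_1_enabled: bool) -> str:
    """Return the candidate model only when it is considered reliable."""
    return CANDIDATE_MODEL if llama_3_1_enabled else STABLE_MODEL

print(pick_model(False))  # llama3-70b-8192
print(pick_model(True))   # llama-3.1-70b-versatile
```

The chosen model ID would then be passed as the `model` parameter of the Groq chat-completions call, so flipping the flag later requires no other code changes.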
@YassKhazzan, I'm just a general user of OpenPerplex, not a developer, and happy to be the first to open an issue and give feedback!
I'm just wondering if you are planning to use the latest Llama models on Groq, e.g. `llama-3.1-70b-versatile`, because the Llama-3-70b that's being used on OpenPerplex.com isn't the latest model. If not, that's fine. What's the reason for sticking with Llama-3-70b? 🙂