kolbytn / mindcraft

MIT License

There is a small but extremely fast service called 'Groq' #108

Open FateUnix29 opened 3 weeks ago

FateUnix29 commented 3 weeks ago

NOTE: Mixtral can at times be... fragile. Let's call it that. Keep the temperature LOW. You can indeed drive it nuts, at least with the system prompt I was using.

I intend to make a fork that just barely supports this service. GroqCloud does support JavaScript-based inference/completions.

The thing here is that Groq is, in fact, not its own AI model. It's a service.

You can choose from multiple Groq-hosted models, including a new one, Mixtral-8x7b, with a 32,768 max token limit.

The JavaScript code snippet is provided by GroqCloud itself, just enter the parameters (including the system prompt) and ask it to give you the code. It's also a small snippet, thank god.
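For reference, a minimal sketch of what that snippet boils down to: building an OpenAI-compatible chat-completion payload and POSTing it to GroqCloud. The endpoint path, default temperature, and option names here are assumptions based on this thread and Groq's OpenAI-compatible API, not copied from the generated snippet.

```javascript
// Build an OpenAI-compatible chat-completion payload for GroqCloud.
// Model name comes from this thread; defaults are assumptions.
function buildGroqPayload(systemPrompt, userMessage, options = {}) {
  return {
    model: options.model || "mixtral-8x7b-32768",
    // Keep the temperature low -- Mixtral gets erratic otherwise (see above).
    temperature: options.temperature ?? 0.2,
    max_tokens: options.maxTokens ?? 1024,
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: userMessage },
    ],
  };
}

// Sending it would look roughly like this (requires a GroqCloud API key).
async function chat(apiKey, payload) {
  const res = await fetch("https://api.groq.com/openai/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`Groq request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Since the API is OpenAI-compatible, wiring this into an existing OpenAI-style client should mostly be a matter of swapping the base URL and model name.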

Mixtral-8x7b on GroqCloud can be found here

I have personal experience with GroqCloud, and I fully intend to test out function calls myself with Mixtral.

(Note: If it was not clear, I have little experience in JavaScript.)

FateUnix29 commented 3 weeks ago

Oh! They also appear to have documentation. Posting that here as well.

FateUnix29 commented 2 weeks ago

UPD: I know the very basics about JavaScript. I may eventually learn enough to submit a PR, but that's a long while away.

Additionally, I forgot to mention that I went into GroqCloud and gave mixtral-8x7b-32768 a function-call test. The function calls looked accurate and would probably work in-game. Mixtral is viable.
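For anyone picking this up: GroqCloud uses the OpenAI-style function-calling shape, so the request carries a `tools` array and the model answers with `tool_calls`. A rough sketch below; the `goToPlayer` tool is purely hypothetical, standing in for whatever actions mindcraft actually exposes.

```javascript
// Hypothetical tool definition in the OpenAI-compatible "tools" format.
// "goToPlayer" is made up for illustration; real bot actions would go here.
const tools = [
  {
    type: "function",
    function: {
      name: "goToPlayer",
      description: "Walk the bot to the named player.",
      parameters: {
        type: "object",
        properties: {
          playerName: { type: "string", description: "Target player name." },
        },
        required: ["playerName"],
      },
    },
  },
];

// Pull the first tool call out of a response message, if the model made one.
// Note: the model returns the arguments as a JSON *string*, so parse it.
function extractToolCall(message) {
  const call = message.tool_calls && message.tool_calls[0];
  if (!call) return null;
  return {
    name: call.function.name,
    args: JSON.parse(call.function.arguments),
  };
}
```

The `tools` array would be added to the chat-completion payload alongside `messages`; the extracted name/args pair is what the bot would dispatch on.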

FateUnix29 commented 2 weeks ago

https://github.com/kolbytn/mindcraft/pull/115