-
### Is there an existing issue for the same bug?
- [X] I have checked the troubleshooting document at https://opendevin.github.io/OpenDevin/modules/usage/troubleshooting
- [X] I have checked the exis…
-
### Pre-reading
- [X] [Issues](https://github.com/Kuingsmile/word-gpt-plus/issues?q=is%3Aissue+sort%3Aupdated-desc+is%3Aclosed)
- [X] [README](https://github.com/Kuingsmile/word-gpt-plus/blob/…
-
### Description
Add support for API at `groq.com`.
### Suggested Solution
The Groq API is compatible with the OpenAI API, so essentially no additional functionality is required.
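To illustrate the compatibility claim: because Groq exposes an OpenAI-style `chat/completions` endpoint, an existing OpenAI client only needs its base URL swapped. The sketch below builds such a request with the standard library; the base URL and model name come from Groq's public documentation, while the helper function itself is purely illustrative and not part of any project in these issues.

```python
import json
import urllib.request

# Groq's OpenAI-compatible endpoint, per Groq's API docs.
GROQ_BASE_URL = "https://api.groq.com/openai/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request aimed at Groq's endpoint."""
    url = f"{GROQ_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(url, data=body, headers=headers)

# Sending the request requires a real key, e.g.:
# with urllib.request.urlopen(build_chat_request(key, "llama3-70b-8192", "Hi")) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```

The same effect is usually achieved with the official `openai` client by setting `base_url` to the Groq endpoint, which is why projects that already speak the OpenAI API tend to need only configuration changes.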
### Alternatives
I achieve…
-
### Is there an existing feature or issue for this?
- [X] I have searched the existing issues
### Expected feature
Ollama is very slow on a VPS with no GPU.
Can you add support for the Groq API?
https://console.gr…
-
I was trying the **LLM extraction** strategy to extract data from a website using the Groq API as the LLM backend with `groq/llama3-70b-8192`, and I am getting an error:
`[ERROR] 🚫 Failed to crawl https://w…
-
![image](https://github.com/stevennt/myai.abn.khoj/assets/32351181/c62efeda-62b7-4093-b70e-4b64eaefc9f8)
-
The [Groq API](https://console.groq.com/docs/api-reference#chat-create) provides a high level of compatibility with the OpenAI API.
The existing Spring AI OpenAI model client can be used to access the …
-
[Groq](https://groq.com/) has tremendous inference speeds (280 tokens per second for Llama 3 70B and 877 tokens per second for Llama 3 8B). It would be amazing to get support for this in Jupyter AI.
-
**Describe the bug**
I'm getting an error when running the Groq example on the repo. I confirmed my Groq key works when making a normal request.
*Code*
```python
from scrapegraphai.graphs import…
-
Hello,
I am a beginner-level user of PrivateGPT and have set it up in 'local' mode with mistral-7b-instruct-v0.2.Q4_K_M.gguf as the LLM.
Please advise me how to add Groq (OpenAI compatible LLM service - https://…
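Since PrivateGPT can target OpenAI-compatible backends, one plausible approach is a settings override pointing its OpenAI-style LLM mode at Groq. The fragment below is a sketch only: the key names follow PrivateGPT's `settings-*.yaml` convention as I understand it, and both they and the model name should be verified against the current PrivateGPT and Groq documentation.

```yaml
# settings-groq.yaml -- hypothetical override; verify key names against
# PrivateGPT's documented settings schema before use.
llm:
  mode: openailike
openai:
  api_base: https://api.groq.com/openai/v1  # Groq's OpenAI-compatible endpoint
  api_key: ${GROQ_API_KEY:}                 # read from the environment
  model: llama3-70b-8192
```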