-
-
**Before you post**
A request for help or a request for a how-to should be directed to [Phind](https://www.phind.com/search?c=I%27m%20using%20the%20Cytoscape.js%20graph%20theory%20JS%20libra…
-
Use the same GGUF format that was used to eval v1.
#94 was opened to track bringing AWQ and vLLM up to the latest versions and trying that out with this model.
-
https://github.com/xenodium/chatgpt-shell/issues/144
Perhaps a phindai?
Phind is a good AI for coding, with both a free tier (no expiration) and a paid tier.
Please see my request at chatgpt-shell:
https://github…
-
1. Create a file called "GameViewSetTest.py" at [backend/rorapp/tests](https://github.com/iamlogand/republic-of-rome-online/tree/main/backend/rorapp/tests).
2. In this script, create a class called `…
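For reference, a test like the one requested above might look roughly like the sketch below; the model name, field, endpoint path, and import locations are assumptions for illustration and are not taken from the repository.
```
# Hypothetical sketch of backend/rorapp/tests/GameViewSetTest.py, assuming a
# Django REST Framework backend with a Game model exposed at /api/games/.
from django.contrib.auth.models import User
from rest_framework import status
from rest_framework.test import APITestCase

from rorapp.models import Game  # assumed import location


class GameViewSetTest(APITestCase):
    def setUp(self):
        # Create and authenticate a user so requests are not rejected as anonymous.
        self.user = User.objects.create_user(username="tester", password="password")
        self.client.force_authenticate(user=self.user)
        self.game = Game.objects.create(name="Test Game")  # assumed field

    def test_list_games(self):
        # The endpoint path is an assumption based on typical DRF router naming.
        response = self.client.get("/api/games/")
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(len(response.data), 1)
```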
-
Greetings!
I'm trying to use a library to get a short description of the text of a news story that I obtain after scraping a news site. I use this short description for publication in the Telegram chann…
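The issue does not say which library is used, so the following is only a minimal sketch assuming the `sumy` extractive summarizer; the function name and sentence count are arbitrary.
```
# Hypothetical sketch: produce a short description from already-scraped article
# text using sumy's LSA summarizer (requires the nltk punkt tokenizer data).
from sumy.parsers.plaintext import PlaintextParser
from sumy.nlp.tokenizers import Tokenizer
from sumy.summarizers.lsa import LsaSummarizer


def short_description(article_text: str, sentences: int = 2) -> str:
    # Parse the raw text and keep the N most representative sentences.
    parser = PlaintextParser.from_string(article_text, Tokenizer("english"))
    summarizer = LsaSummarizer()
    return " ".join(str(sentence) for sentence in summarizer(parser.document, sentences))
```
The resulting description could then be posted to the Telegram channel with any Bot API client.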
-
**Describe the bug**
**Steps to reproduce**
Steps to reproduce the behavior:
1. Download the model Wizard Coder Python 13B Q5 in the Model Hub
2. Start the model and start the conversation
…
-
Hello, I built the Triton 23.12 container with the TensorRT-LLM 0.7.1 backend using the third build option in the Triton TRT-LLM guide, and deployed two models, Mistral 7B Instruct and Phind CodeLlama v2 34B, an…
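For context, a deployment like this is usually exercised through Triton's HTTP `generate` endpoint (available since release 23.10); the model name `ensemble`, the port, and the parameter names below follow the tensorrtllm_backend examples and may differ in this particular setup.
```
# Hypothetical client call against one of the two deployed models.
import requests

payload = {
    "text_input": "Write a Python function that reverses a string.",
    "max_tokens": 128,
    "temperature": 0.2,
}
response = requests.post(
    "http://localhost:8000/v2/models/ensemble/generate",  # assumed model name and port
    json=payload,
    timeout=120,
)
print(response.json()["text_output"])
```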
-
I'm getting this error:
```
g4f.Provider.Bing supports: (
    model,
    messages: Messages,
    proxy: str = None,
    cookies: dict = None,
    tone: str = Creative,
    image: str = None
)
```
Us…
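A call that sticks to the parameters listed in the error might look like the sketch below; the g4f API surface changes between releases, so treat the model choice and keyword arguments as assumptions.
```
# Hypothetical g4f call using only parameters the Bing provider reports supporting.
import g4f

response = g4f.ChatCompletion.create(
    model=g4f.models.default,
    provider=g4f.Provider.Bing,
    messages=[{"role": "user", "content": "Hello"}],
    tone="Creative",  # provider-specific option from the error above
)
print(response)
```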
-
### What happened?
This snippet works
```
response = litellm.completion(
    model="gemini/gemini-pro",
    messages=[{"role": "user", "content": "write code for saying hi from LiteLLM"}]
)…