-
### Your current environment
For setup, I am using version 0.5 and the `vllm_openai` target of the Dockerfile with these arguments:
```
environment:
  - NCCL_SOCKET_IFNAME=eth0
…
```
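Once the container is up, the `vllm_openai` target serves an OpenAI-compatible API; a minimal client-side sanity check, assuming the default port 8000 (the model name below is a placeholder, not taken from the original report):
```python
# Minimal sanity check against vLLM's OpenAI-compatible endpoint.
# Assumes the container maps port 8000; model name is illustrative.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
resp = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```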
-
The code below works when I use the Mixtral model from Ollama directly, but when I use the IPEX-LLM-optimized Mixtral model, the tool does not work. This is a simple tool for testing functionality, whic…
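The referenced code is truncated above; what follows is a minimal sketch of the sort of tool-calling test described, assuming a local Ollama server on the default port, an Ollama version with tool support, and a hypothetical `get_weather` tool:
```python
# Hedged sketch: exercise Ollama's /api/chat tool calling.
# The tool definition and model tags are illustrative, not from the original report.
import requests

payload = {
    "model": "mixtral:latest",  # swap for the IPEX-LLM-optimized tag to reproduce
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    "stream": False,
}
r = requests.post("http://localhost:11434/api/chat", json=payload)
print(r.json()["message"].get("tool_calls"))  # Mixtral should emit a tool call; None matches the failure described
```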
-
For fully private deploys:
Let's test out 4-bit Mixtral, and adapt `pkg/dataprep/text/dynamic.go` to call into it (see the loading sketch below).
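A quick way to smoke-test 4-bit Mixtral on its own, before any wiring into `pkg/dataprep/text/dynamic.go`, is bitsandbytes quantization via transformers; a minimal sketch, assuming a CUDA machine with enough VRAM (this is one possible 4-bit backend, not necessarily the one intended here):
```python
# Sketch: load Mixtral in 4-bit with bitsandbytes; model ID and prompt are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

quant = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)
tok = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B-Instruct-v0.1")
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-Instruct-v0.1",
    quantization_config=quant,
    device_map="auto",
)
out = model.generate(**tok("Hello", return_tensors="pt").to(model.device), max_new_tokens=32)
print(tok.decode(out[0], skip_special_tokens=True))
```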
-
When I run `node main.js`, it outputs:
```
[ './andy.json' ]
Starting agent with profile: ./andy.json
Starting agent initialization with profile: ./andy.json
Initializing action manager...
Initializing…
```
-
**Problem**
Jan is great, but I'm limited in the number of models I can run on my 16GB GPU. I saw there is a project called [mixtral-offloading](https://github.com/dvmazur/mixtral-offloading) that cou…
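The snippet cuts off, but mixtral-offloading's pitch is per-expert GPU/CPU caching for MoE models. For comparison, the coarser layer-level offload already available through accelerate looks roughly like the sketch below; the memory caps are illustrative, and this is not the project's expert-level scheme:
```python
# Sketch: cap GPU usage and spill the rest of Mixtral to CPU RAM via accelerate.
# This is coarser (whole layers) than mixtral-offloading's per-expert caching.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-Instruct-v0.1",
    device_map="auto",
    max_memory={0: "14GiB", "cpu": "60GiB"},  # leave headroom on a 16GB GPU
)
```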
-
### System Info
While building TensorRT engines for the Mixtral model Mixtral-8x7B-Instruct-v0.1, I ran into this error:
```
Loading checkpoint shards: 21%|██████████████████████████████████▌ …
```
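That progress bar comes from transformers loading the sharded Hugging Face checkpoint before the engine build. Since the error itself is truncated, one isolation step is to confirm the checkpoint loads on its own; a sketch, with the model ID standing in for whatever local path the build used:
```python
# Sketch: verify the HF checkpoint loads outside the TensorRT-LLM build.
# If this also fails around the same shard, the problem is the checkpoint
# or host memory, not the engine build itself.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-Instruct-v0.1",  # or the local checkpoint path from the build
    torch_dtype="auto",
    low_cpu_mem_usage=True,  # stream shards instead of materializing full copies
)
print(sum(p.numel() for p in model.parameters()))
```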
-
As a [user],
I want a drop-down menu in the Model accordion to display different model options,
So that I can choose the model that best fits my needs.
Acceptance Criteria:
1. The AI Model accordi…
-
This is not really an issue, but I couldn't find any other way to contact you. I was trying to follow your instructions on https://www.philschmid.de/sagemaker-deploy-mixtral and ended up in this repos…
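For reference, the deployment described in that post boils down to the Hugging Face LLM container on SageMaker; a condensed sketch, with the container version and instance type as assumptions rather than the post's exact values:
```python
# Sketch of the SageMaker deploy pattern from the post; version and instance are assumptions.
import json
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

role = sagemaker.get_execution_role()
llm_image = get_huggingface_llm_image_uri("huggingface", version="1.3.3")

model = HuggingFaceModel(
    role=role,
    image_uri=llm_image,
    env={
        "HF_MODEL_ID": "mistralai/Mixtral-8x7B-Instruct-v0.1",
        "SM_NUM_GPUS": json.dumps(8),  # e.g. a g5.48xlarge has 8 GPUs
        "MAX_INPUT_LENGTH": json.dumps(24000),
        "MAX_TOTAL_TOKENS": json.dumps(32000),
    },
)
llm = model.deploy(initial_instance_count=1, instance_type="ml.g5.48xlarge")
```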
-
I can't seem to get this working. I did `ollama pull dolphin-mixtral:latest` and it pulled the model. Then, in a tmux instance, I run `ollama serve`, and then I try to use my bot with `dolphin-mixtral:latest` in the INIT env…
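Before debugging the bot itself, it can help to confirm the serve instance actually answers for that tag; a minimal check, assuming the default Ollama port:
```python
# Sketch: confirm `ollama serve` responds for the dolphin-mixtral tag.
import requests

r = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "dolphin-mixtral:latest", "prompt": "ping", "stream": False},
)
print(r.status_code, r.json().get("response", r.text)[:200])
```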
-
**Describe the bug**
Mistral and Mixtral models are not able to infer.
When I give the name of the model, as I do for other models, in the case of Mistral there is a `KeyError` from the `configuration_auto.py` file…
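A `KeyError` in `configuration_auto.py` typically means the installed transformers predates Mistral/Mixtral support (added around versions 4.34 and 4.36 respectively, if memory serves). A quick check after upgrading:
```python
# Sketch: verify the installed transformers knows the mistral/mixtral model types.
# Upgrade first, e.g.: pip install -U "transformers>=4.36"
from transformers import AutoConfig

for repo in ("mistralai/Mistral-7B-v0.1", "mistralai/Mixtral-8x7B-Instruct-v0.1"):
    cfg = AutoConfig.from_pretrained(repo)
    print(repo, "->", cfg.model_type)  # a KeyError here means the version is still too old
```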