-
### Describe the issue
Issue: I encountered the following problem:
> [Usage] ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' #1101
There was a version compatibility issue …
-
### What happened + What you expected to happen
Unable to connect to a Ray cluster via the client in version 2.7.0 when specifying a `working_dir`.
It works in 2.6.3.
### Versions / Dependencies
ra…
-
#### ALL software version info
(This library, plus any other relevant software, e.g. Bokeh, Python, notebook, OS, browser, etc., should be listed within the dropdown below.)
Software Version In…
-
### Describe the issue as clearly as possible:
I'm trying to use Ollama with the OpenAI API to generate a Pydantic object.
I think that's because Ollama doesn't support `response_format`, which is …
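When a backend doesn't honor `response_format`, a common workaround is to embed the Pydantic model's JSON schema in the prompt and validate the reply client-side. A minimal sketch of that validation step (the `City` model and the raw reply are hypothetical placeholders; the actual request to Ollama's OpenAI-compatible endpoint is omitted):

```python
import json
from pydantic import BaseModel


class City(BaseModel):
    # Hypothetical example model; replace with your own schema.
    name: str
    population: int


# JSON schema you could embed in the prompt so the model
# knows the expected output shape.
schema = json.dumps(City.model_json_schema(), indent=2)

# Stand-in for the model's raw text reply:
raw_reply = '{"name": "Paris", "population": 2100000}'

# Validate the reply yourself instead of relying on
# server-side `response_format` enforcement.
city = City.model_validate_json(raw_reply)
```

If the reply doesn't match the schema, `model_validate_json` raises a `ValidationError`, which you can catch to retry the request.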
-
I am trying to follow the instructions in the README:

    from sentence_transformers import SentenceTransformer
    model = SentenceTransformer('paraphrase-distilroberta-base-v1')
    sentences = ['This fr…
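
The README flow typically continues by encoding the sentences and comparing the resulting embeddings. Since the snippet is cut off, here is a sketch of the comparison step only, with placeholder vectors standing in for real `model.encode(sentences)` output:

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Placeholder embeddings in place of model.encode(sentences);
# real SentenceTransformer embeddings are 768-dimensional here.
emb_a = np.array([1.0, 0.0, 1.0])
emb_b = np.array([0.0, 1.0, 0.0])

sim_same = cosine_similarity(emb_a, emb_a)  # identical vectors → 1.0
sim_diff = cosine_similarity(emb_a, emb_b)  # orthogonal vectors → 0.0
```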
-
I have been running pyscenic ctx for over 5 days. My unfiltered loom file contains ~7k cells. How long does this step typically take? I am running it from a slurm script on my HPC using the following …
-
### Describe the bug
When I try to serve Llama 3.1 8B (4-bit) with OpenLLM, it says "This model's maximum context length is 2048 tokens".
On https://huggingface.co/meta-llama/Meta-Llama-3.1-8B,…
-
### What happened?
    from spectrochempy import *
    Traceback (most recent call last):
      File C:\ProgramData\anaconda3\envs\scpy\Lib\site-packages\spectrochempy\api.py:116
        IP.run_line_magic("matp…
-
@twopirllc you are closing issues quickly, so I created a new one with the extra code you asked for here: https://github.com/twopirllc/pandas-ta/issues/826
---
OS: Ubuntu 20.04.6 LTS
Server: Intel(R…
-
I'm not sure if this is officially supported already, but since there are leaderboard results on the Kaggle site, I suppose it should work. Default mode seems to work as expected. I used the latest ve…