-
A project called AI Dungeon (https://colab.research.google.com/github/nickwalton/AIDungeon/blob/master/AIDungeon_2.ipynb) supposedly uses the 1558M model to generate stories in Colaboratory. Howe…
r3ndd updated
4 years ago
-
I am a noob. Here is my code; how can I modify it to do batch inference?
---
def load_model():
    model_id = 'llama3/Meta-Llama-3-70B-Instruct'
    pipeline = transformers.pipeline(
        "t…
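Since the snippet is cut off, here is a minimal sketch of one common way to batch with a `transformers` text-generation pipeline: pass a *list* of prompts and a `batch_size=` argument so the pipeline runs several prompts per forward pass. The model path is taken from the question; the helper and function names are illustrative, not part of any library.

```python
def batched(items, batch_size):
    """Split a list of prompts into fixed-size chunks (illustrative helper)."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]


def run_batch_inference(prompts,
                        model_id="llama3/Meta-Llama-3-70B-Instruct",
                        batch_size=8,
                        max_new_tokens=128):
    # Heavy import kept inside the function so the pure helper above
    # can be used without transformers installed.
    import transformers

    pipe = transformers.pipeline(
        "text-generation",
        model=model_id,
        device_map="auto",
    )
    # Generation with batching needs a pad token; reusing eos is a
    # common workaround when the tokenizer defines none.
    pipe.tokenizer.pad_token_id = pipe.tokenizer.eos_token_id

    # A list of prompts plus batch_size= makes the pipeline tokenize,
    # pad, and run the prompts in batches instead of one at a time.
    return pipe(prompts, batch_size=batch_size, max_new_tokens=max_new_tokens)
```

Whether batching actually helps on a 70B model depends on available GPU memory; `batched()` is only shown to make the chunking explicit if you prefer to loop over batches yourself.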
-
## Description
After https://github.com/dmlc/gluon-nlp/pull/1356 (Thanks @szha and @leezu!), GluonNLP has now fully embraced the new Gluon 2.0 API. We will no longer need to worry about the `hybrid_f…
-
I mused about this briefly in
https://github.com/projectatomic/rpm-ostree/pull/697
and https://github.com/projectatomic/rpm-ostree/pull/728
Basically, I'd like us to answer the question of "Is m…
-
Checklist:
* [x] I've searched in the docs and FAQ for my answer: https://bit.ly/argocd-faq.
* [x] I've included steps to reproduce the bug.
* [x] I've pasted the output of `argocd version`.
…
-
### System Info
pandasai 2.2.14
python 3.12
### 🐛 Describe the bug
https://github.com/Sinaptik-AI/pandas-ai/blob/05431072676d44d409c6c95620c6f561370ec3ef/pandasai/pipelines/chat/prompt_generation.…
-
Thank you very much for your open-source work. I ran into a bug while downloading your pretrained model, so I downloaded the llava-7b-lora model instead, hitting the same problem as in issue5. The command line for testing ScienceQA is as follows:
```
python -m llava.eval.model_vqa_science \
--model-base /mnt/xiaofeng.zxf/models/vicuna-7b…
-
I am trying to use the cuSOLVERMp library on Perlmutter. I built the examples and tried to run them on a GPU node with
`srun -n 2 ./mp_potrf_potrs -verbose 1`
The output is attached in out.txt. I al…
s769 updated
11 hours ago
-
Hi,
I was reading the article [Inflection-2.5: meet the world's best personal AI](https://inflection.ai/inflection-2-5), and it mentions that `nearly 25%—of examples in the reason…
-
**Description**
This is now the second time in a short span that the `proto.lock` hasn't been updated.
First time:
https://github.com/camunda/zeebe/pull/11807
Second time:
```
$ gi…