-
### Issue Description
Hello,
I am trying to generate an explanation of abstractive text summarization output for a long piece of input text. I have been using various transformers models, e.g. Big…
SVC04 updated 8 months ago
-
I am having a go at running inference and evaluation for this model and am running into a `TypeError` in `GPTLMHeadModel`:
```
In [1]: import torch
...: from transformers import AutoTokenizer
…
```
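Since the snippet above is truncated, the exact failing call isn't visible. One generic way to pin down which keyword argument triggers a `TypeError` in a model's `forward()` is to filter the input dict against the method's signature. A minimal sketch using only the standard library — the `Model` class here is a hypothetical stand-in, not the real `GPTLMHeadModel`:

```python
import inspect

# Hypothetical stand-in for a model; real models expose a
# comparable signature on their forward() method.
class Model:
    def forward(self, input_ids, attention_mask=None, labels=None):
        return input_ids

# Names that forward() actually accepts.
params = set(inspect.signature(Model.forward).parameters)

# Drop any key that would raise
# "forward() got an unexpected keyword argument ...".
inputs = {"input_ids": [1, 2, 3], "token_type_ids": [0, 0, 0]}
accepted = {k: v for k, v in inputs.items() if k in params}
```

Printing the rejected keys (`inputs.keys() - params`) usually identifies the offending argument immediately.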
-
(dynamicrafter) D:\AI software\DynamiCrafter-main>python gradio_app_interp_and_loop.py
WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
PyTorch 2.0.0+cu118 wit…
-
### Checklist
- [X] The issue exists after disabling all extensions
- [X] The issue exists on a clean installation of webui
- [ ] The issue is caused by an extension, but I believe it is caused by a …
-
Hi, thank you so much for a wonderful tool!
I was trying to use HaplotagLR on a heterozygous mouse dataset and am getting the error below. The test data were processed with no errors. Do you have any …
-
Here is the traceback:
```
Build model...
Traceback (most recent call last):
  File "ram.py", line 117, in <module>
    model = RecurrentAttentionMemory()
  File "ram.py", line 86, in __init__
    x = sha…
```
-
### Request description
An E2E test suite for Attention that includes a reference implementation.
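For concreteness, a reference Attention implementation of the kind such a suite could compare against can be a few lines of plain Python. This sketch is illustrative only (list-based tensors, no batching), not tied to the compiler's actual test harness:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(QK^T / sqrt(d)) V,
    # with Q, K, V as lists of row vectors.
    d = len(Q[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out
```

Such an oracle is deliberately slow and simple, so numerical disagreements with the compiled kernel point at the kernel rather than the reference.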
### What component(s) does this issue relate to?
Compiler
### Additional context
I ra…
-
I followed the README:
```
# Build model with both INT8 weight-only and INT8 KV cache enabled
python convert_checkpoint.py --model_dir ./bloom/560m/ \
                             --dtype float16 \
…
```
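As background for the flags above, INT8 weight-only quantization generally stores a floating-point scale per channel alongside the 8-bit integers. A minimal sketch of the symmetric scheme — illustrative only, not TensorRT-LLM's internal code:

```python
def quantize_row(weights):
    # Symmetric per-row scale: map the largest magnitude to 127.
    # Falls back to 1.0 if the row is all zeros.
    scale = max(abs(w) for w in weights) / 127.0 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize_row(q, scale):
    # Recover an approximation of the original FP weights.
    return [x * scale for x in q]

row = [0.5, -1.27, 0.02]
q, s = quantize_row(row)
approx = dequantize_row(q, s)
```

The error introduced is bounded by half a quantization step (`scale / 2`) per weight, which is why per-channel scales matter for rows with very different magnitudes.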
-
### Is there an existing issue for this?
- [X] I have searched the existing issues and checked the recent builds/commits
### What happened?
### Steps to reproduce the problem
python 3.10, …
-
### Please check that this issue hasn't been reported before.
- [X] I searched previous [Bug Reports](https://github.com/OpenAccess-AI-Collective/axolotl/labels/bug) and didn't find any similar reports.
…