FlagAI-Open / FlagAI
FlagAI (Fast LArge-scale General AI models) is a fast, easy-to-use, and extensible toolkit for large-scale models.
Apache License 2.0 · 3.82k stars · 415 forks
issues
Add CCI3-HQ scripts.
#573
ftgreat
closed
6 days ago
0
Add CCI3 scripts.
#572
ftgreat
closed
6 days ago
0
[Question]: Why not call optimizer.zero_grad() in the train_step_pytorch and train_step_pytorchDDP functions?
#570
THU-Kingmin
closed
5 days ago
1
[Question]: AltCLIP: local variable 'model_id' referenced before assignment
#569
molu-ggg
opened
6 months ago
0
[Question]: TypeError: __init__() got multiple values for argument 'text_config'
#568
molu-ggg
opened
6 months ago
0
Aquila2 inference optimization: after quantizing Aquila2-34b, how can single-node multi-GPU and multi-node multi-GPU parallel inference be used to improve efficiency?
#567
monkeyZhy
opened
6 months ago
0
[Question]: Is the Langchain framework currently supported?
#566
PlatinumDiors
opened
6 months ago
0
Windows bug ModuleNotFoundError: No module named 'torch._six'
#565
MGzhou
opened
8 months ago
0
Update python-app.yml
#564
zacliu2023
closed
8 months ago
0
Update python-app.yml
#563
zacliu2023
closed
8 months ago
0
[Question]: Is there a text classification example for ALM in Arabic?
#562
handywc
opened
8 months ago
0
AltCLIP has no effect on CIFAR10 after finetune
#561
youyuge34
opened
8 months ago
1
[Question]: Do you have this dataset?
#560
kopyl
closed
8 months ago
0
[Question]: Does AltCLIP provide scripts for fine-tuning and evaluation on Flickr30k?
#559
qyliuAI
opened
9 months ago
0
Update README.md
#558
eltociear
opened
10 months ago
0
How does one obtain training data suitable for fine-tuning Aquila models?
#557
Micla-SHL
closed
10 months ago
2
TypeError: AquilaPreTrainedModel._set_gradient_checkpointing() got an unexpected keyword argument 'enable'
#556
Chenjingliang1
closed
10 months ago
2
Can't run examples/vit_cifar100/train_deepspeed.py
#555
twmht
opened
11 months ago
1
[Question]: Why are the arguments passed in AquilaDecoderLayer floats? Does this mean float-typed attention is used?
#554
Chenjingliang1
opened
11 months ago
2
torchrun command not found
#553
Micla-SHL
closed
11 months ago
5
[Question]: No module named 'packaging' This question is after install NVIDIA's apex
#552
Micla-SHL
closed
11 months ago
1
fixed yaml setup issue
#551
BAAI-OpenPlatform
closed
11 months ago
0
Error when collecting PyYAML==5.4.1 (from flagai)
#550
Darrenzeng
closed
11 months ago
4
Error when running the aquilachat2-34b-16k chat script
#549
gaord
closed
8 months ago
1
[Question]: Loading AquilaChat2-7B-16K with the flagai AutoLoader produces garbled output for English input
#548
muzian666
closed
10 months ago
2
AquilaChat2 34B inference issue
#547
ningpengtao-coder
closed
11 months ago
1
Update conversation.py
#546
Anhforth
closed
11 months ago
0
updated version
#545
Anhforth
closed
11 months ago
0
changed aquila prediction
#544
Anhforth
closed
11 months ago
0
added quantize
#543
Anhforth
closed
12 months ago
0
add aquila sql
#542
xiaofengShi
closed
12 months ago
0
fixed bug
#541
Anhforth
closed
12 months ago
0
Added aquila2 finetuning
#540
Anhforth
closed
12 months ago
0
add flagai fit for aquila2
#539
Anhforth
closed
1 year ago
0
[Question]: How to train the altclip model
#538
gybwyy-art
opened
1 year ago
0
add aquila2 modeling file
#537
shunxing1234
closed
1 year ago
0
[Question]: Running the AltCLIP example inference.py directly raises KeyError: 'text_config'
#536
susht3
closed
1 year ago
2
fix Update grads.py Torch does not support torch._six anymore
#535
fenglui
opened
1 year ago
0
[Question]: Model hub is not reachable!
#534
susht3
closed
11 months ago
4
[Question]: UnboundLocalError: local variable 'model_id' referenced before assignment
#533
susht3
closed
1 year ago
0
[Question]: The AltCLIP parameters on modelhub are inconsistent with those on Hugging Face
#532
Jake-wei
closed
8 months ago
4
[Question]: LoRA fine-tuning
#531
Bit-sjw
closed
11 months ago
3
add aquila flagai2hf weight script
#530
920232796
closed
1 year ago
0
Update docs
#529
Anhforth
closed
1 year ago
0
Update docs
#528
Anhforth
closed
1 year ago
0
How to load the AltCLIP model?
#527
susht3
opened
1 year ago
6
[Question]: How to perform model-parallel training?
#526
kunden0612
closed
1 year ago
2
[Question]: How to implement a multilingual tokenizer?
#525
BaiqingL
closed
1 year ago
1
updated docs
#524
Anhforth
closed
1 year ago
0
Fit torch
#523
Anhforth
closed
1 year ago
0