hpcaitech / ColossalAI
Making large AI models cheaper, faster and more accessible
https://www.colossalai.org
Apache License 2.0 · 38.69k stars · 4.34k forks
Issues (newest first)
#6078 [pre-commit.ci] pre-commit autoupdate (pre-commit-ci[bot], opened 1 day ago, 0 comments)
#6077 [Feature] ZeroBubble support MoeHybridplugin; (duanjunwen, opened 2 days ago, 0 comments)
#6076 FasterMoE shadow expert implement (Guodanding, opened 2 days ago, 0 comments)
#6075 [zerobubble] rebase main (flybird11111, opened 4 days ago, 0 comments)
#6074 [zerobubble] rebase main (flybird11111, closed 4 days ago, 0 comments)
#6073 [doc] sequence parallel document (wangbluo, opened 4 days ago, 0 comments)
#6072 add funding news (binmakeswell, closed 5 days ago, 0 comments)
#6071 [Fix] fix the 2d ring attn when using multiple machine (wangbluo, opened 6 days ago, 0 comments)
#6070 [Fix] fix the 2d ring attn when using multiple machine (wangbluo, closed 6 days ago, 0 comments)
#6069 [HotFix] Fix stage_index in zerobubble test; (duanjunwen, closed 4 days ago, 0 comments)
#6068 release FP8 news (binmakeswell, closed 6 days ago, 0 comments)
#6067 2024 list of available airports in the Xinjiang region (swhmy, closed 6 days ago, 1 comment)
#6066 [DOC]: Environment installation failed (eccct, opened 1 week ago, 16 comments)
#6065 [Feat] Support zero bubble with shardformer input (duanjunwen, closed 1 week ago, 0 comments)
#6064 [sp] : fix the attention kernel for sp (wangbluo, closed 1 week ago, 0 comments)
#6063 [moe] add parallel strategy for shared_expert && fix test for deepseek (botbw, closed 1 week ago, 1 comment)
#6062 [release] update version (ver217, closed 1 week ago, 0 comments)
#6061 [sp] : fix the attention kernel for sp (wangbluo, closed 2 weeks ago, 0 comments)
#6060 [plugin]hybrid support zero bubble pipeline (flybird11111, closed 4 days ago, 0 comments)
#6059 [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (GuangyaoZhang, closed 2 weeks ago, 0 comments)
#6058 [BUG]: Disable all_gather intranode. Disable Redundant all_gather fp8 (GuangyaoZhang, closed 2 weeks ago, 0 comments)
#6057 [fp8] fix missing fp8_comm flag in mixtral (botbw, closed 2 weeks ago, 0 comments)
#6056 [ColossalEval] support for vllm (Camille7777, closed 1 week ago, 2 comments)
#6055 [shardformer]update sp doc (flybird11111, closed 2 weeks ago, 0 comments)
#6054 [Coati] Train DPO using PP (TongLi3701, opened 3 weeks ago, 0 comments)
#6053 [fp8] hotfix backward hook (ver217, closed 2 weeks ago, 0 comments)
#6052 [Coati] Support PP for DPO training (TongLi3701, closed 3 weeks ago, 0 comments)
#6050 [doc] FP8 training and communication document (GuangyaoZhang, closed 2 weeks ago, 0 comments)
#6049 [DOC]: Add document for FP8 training and communication (GuangyaoZhang, closed 2 weeks ago, 0 comments)
#6048 [hotfix] moe hybrid parallelism benchmark & follow-up fix (botbw, closed 3 weeks ago, 0 comments)
#6047 [FEATURE]: Is it Possible to integrate Liger-Kernel? (ericxsun, opened 3 weeks ago, 4 comments)
#6046 [fp8] fix linear hook (ver217, closed 4 weeks ago, 0 comments)
#6045 disable all_to_all_fp8 in intranode (BurkeHulk, closed 3 weeks ago, 0 comments)
#6044 [ci] Remove triton cache in compatibility tests (yuanheng-zhao, opened 4 weeks ago, 0 comments)
#6043 [fp8] optimize all-gather (ver217, closed 4 weeks ago, 0 comments)
#6042 [Hotfix] Remove deprecated install (TongLi3701, closed 4 weeks ago, 0 comments)
#6041 [release] update version (ver217, closed 3 weeks ago, 0 comments)
#6040 [FP8] Unsqueeze scale to make it compatible with torch.compile (GuangyaoZhang, closed 1 month ago, 0 comments)
#6039 [BUG]: remove `.github/workflows/submodule.yml` (BoxiangW, opened 1 month ago, 0 comments)
#6038 [Hotfix] Auto Fused Norm (TongLi3701, closed 1 month ago, 0 comments)
#6037 [FEATURE]: Support Zerobubble pipeline (duanjunwen, opened 1 month ago, 0 comments)
#6036 [plugin] hotfix zero plugin (ver217, closed 1 month ago, 0 comments)
#6035 [zerobubble] support distributed layers for zero bubble v scheduler. (flybird11111, opened 1 month ago, 0 comments)
#6034 [zerobubble]Support ZeroBubble Pipeline (duanjunwen, closed 3 weeks ago, 0 comments)
#6033 [fp8] fix the merge (wangbluo, closed 1 month ago, 0 comments)
#6032 [BUG]: error Colossalai 0.4.0/0.4.2 /usr/bin/supervisord (Storm0921, opened 1 month ago, 2 comments)
#6031 [Hotfix] Fix llama fwd replacement bug (Edenzzzz, closed 1 month ago, 0 comments)
#6030 [Colossal-LLaMA] Refactor latest APIs (TongLi3701, closed 1 month ago, 0 comments)
#6029 Update train_dpo.py (flybird11111, closed 1 month ago, 0 comments)
#6028 How to train two models at the same time? (wangqiang9, closed 1 month ago, 4 comments)