foundation-model-stack / fms-acceleration
🚀 Collection of libraries used with fms-hf-tuning to accelerate fine-tuning and training of large models.
Apache License 2.0 · 0 stars · 4 forks
Issues
#45 Add MultiRepo Build Packaging Flow and Release Upper Bounds · fabianlim · closed 1 hour ago · 0 comments
#44 Extract Out Model Patcher to Framework · fabianlim · opened 5 days ago · 0 comments
#43 Enable packaging CUDA wheels · fabianlim · opened 5 days ago · 1 comment
#42 Bound Package Versions · achew010 · closed 5 days ago · 0 comments
#41 Upstream Main: Add Lora Dropout, YAML EnvVar Configuration · fabianlim · closed 5 days ago · 0 comments
#40 Allow Benchmarks to be run with configuration pointed to in YAML env variable · fabianlim · closed 5 days ago · 0 comments
#39 Enable CUDA Unit Tests in GH Workflows · fabianlim · opened 1 week ago · 0 comments
#38 Extract out AutoGPTQ dependency · fabianlim · opened 1 week ago · 2 comments
#37 Fused Ops Support for Lora Dropout · fabianlim · closed 1 week ago · 1 comment
#36 Added trl installation constraint to tox command · achew010 · closed 2 weeks ago · 0 comments
#35 Upstream Main: Fused Ops and Kernels, FSDP and Memory Fixes · fabianlim · closed 3 weeks ago · 0 comments
#34 Updated benchmark reference · achew010 · closed 3 weeks ago · 2 comments
#33 Support Position Ids in Rope · fabianlim · opened 3 weeks ago · 0 comments
#32 Allow Fused Ops to Support Dropout · fabianlim · closed 1 week ago · 0 comments
#31 Address Incorrect Ignoring of Base Layer Modules for FSDP with Kernels · fabianlim · closed 3 weeks ago · 0 comments
#30 Shift GPU Memory Computation to End of Benchmarking Script · achew010 · closed 3 weeks ago · 0 comments
#29 Add MLP & QLoRA Fused Ops and Kernels, Mixtral · fabianlim · closed 4 weeks ago · 2 comments
#28 Fix FSDP casting issue for Autogptq and Fused Ops · fabianlim · closed 1 month ago · 1 comment
#27 Group memory field names with prefix and minor fixes · achew010 · closed 1 month ago · 0 comments
#26 Workaround Low-Mem-Mode Patch for GPTQ-LoRA · achew010 · closed 1 month ago · 1 comment
#25 Initial Addition of FusedOps and Kernels Plugin With Model Patcher · fabianlim · closed 1 month ago · 0 comments
#24 Linting and Formatting for FMS-Acceleration-Peft package · achew010 · closed 1 month ago · 0 comments
#23 Linting and Formatting for FMS-Acceleration-Peft package · achew010 · closed 1 month ago · 1 comment
#22 Upstream Main: Linting, Benchmarking, HF QLoRA baseline, FSDP fixes for GPTQ-LoRA · fabianlim · closed 1 month ago · 0 comments
#21 Revert "Upstream Main: Linting, Benchmarking, HF QLoRA baseline, FSDP fixes for GPTQ-LoRA" · fabianlim · closed 1 month ago · 0 comments
#20 Upstream Main: Linting, Benchmarking, HF QLoRA baseline, FSDP fixes for GPTQ-LoRA · fabianlim · closed 1 month ago · 0 comments
#19 Group Memory Field Names with Common Prefix · achew010 · closed 1 month ago · 1 comment
#18 Allow AutoGPTQ to work in low cpu memory mode · fabianlim · closed 1 month ago · 1 comment
#17 Release Upper Limit on Torch, Transformers, Accelerate · fabianlim · opened 1 month ago · 2 comments
#16 Added support for running official HF baseline FSDP-QLoRA benchmark · achew010 · closed 1 month ago · 0 comments
#15 Fix FSDP when performing GPTQ-LoRA with Triton V2 · fabianlim · closed 1 month ago · 1 comment
#14 Provide Memory Benchmarking Feature to Benchmarking Code · achew010 · closed 1 month ago · 7 comments
#13 Improvements to Benchmark Scripts and Config Generation Workflow · fabianlim · closed 1 month ago · 0 comments
#12 Memory Consumption for GPTQ-LoRA is higher than QLoRA in Distributed Finetuning · achew010 · closed 1 month ago · 1 comment
#11 Readme Improvements · fabianlim · closed 1 month ago · 0 comments
#10 Allow BNB Plugin to be Loaded Without PEFT Wrapping · fabianlim · closed 1 month ago · 1 comment
#9 Finish Linting of Accelerated Peft Plugin and Scripts · fabianlim · closed 1 week ago · 1 comment
#8 Add GPU measurements to Benchmark Script · fabianlim · closed 1 month ago · 0 comments
#7 Add GitHub Workflow for Linting, Formatting and Test. Activate Workflow for Framework · fabianlim · closed 1 month ago · 0 comments
#6 Integration of Fused Modules and Kernel Enhancements · achew010 · closed 1 month ago · 0 comments
#5 Add Configs and Arguments Listing · fabianlim · closed 1 month ago · 1 comment
#4 More README updates. Bench result updates and script Improvements. CLI docstring improvements. · fabianlim · closed 1 month ago · 0 comments
#3 Failure in FSDP Benchmark Experiment using QLoRA with Custom Fused Modules · achew010 · closed 1 week ago · 6 comments
#2 Fix Issues With Benchmark Script · fabianlim · closed 1 month ago · 1 comment
#1 Add BenchMarking Script · fabianlim · closed 1 month ago · 0 comments