kathrinse/be_great
A novel approach for synthesizing tabular data using pretrained large language models
MIT License
254 stars, 43 forks
Adapt LoRA with the distilgpt2 model on GReaT #20
Closed

zhao-zilong closed this issue 1 year ago

zhao-zilong commented 1 year ago
Incorporate LoRA into GReaT to reduce memory usage.
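To illustrate why LoRA reduces memory usage for a model like distilgpt2, here is a minimal NumPy sketch of the low-rank adaptation idea itself (not the GReaT or PEFT API): the pretrained weight `W` stays frozen, and only a small low-rank update `B @ A` is trained. All names (`d_in`, `d_out`, `r`, `lora_alpha`) are illustrative assumptions.

```python
import numpy as np

# LoRA sketch: freeze the full weight W (d_out x d_in) and learn only
# a low-rank update B @ A, with B (d_out x r), A (r x d_in), r << 768.
d_in, d_out, r, lora_alpha = 768, 768, 8, 16  # 768 = distilgpt2 hidden size

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, init 0

def lora_forward(x):
    """Frozen path plus scaled low-rank update (standard LoRA form)."""
    return W @ x + (lora_alpha / r) * (B @ (A @ x))

# The memory saving the issue is after: far fewer trainable parameters.
full_params = W.size            # 768 * 768 = 589824
lora_params = A.size + B.size   # 2 * 8 * 768 = 12288, ~48x fewer
print(full_params, lora_params)
```

Because `B` is initialized to zero, the adapted layer starts out identical to the frozen pretrained layer; training then only updates `A` and `B`, which is what cuts optimizer and gradient memory during GReaT fine-tuning.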