chenglu opened 1 year ago
Date | Blog Title | Source File | Chinese Translation File | Claimed By |
---|---|---|---|---|
2023-04-13 | Train and Deploy BLOOM with Amazon SageMaker and PEFT | GitHub | 2023-04-13-bloom-sagemaker-peft.ipynb | xiaodouzi666 (claimed) |
2023-04-04 | Introducing IGEL an instruction-tuned German large Language Model | none | 2023-04-04-introducing-igel.md | |
2023-03-23 | Efficient Large Language Model training with LoRA and Hugging Face | GitHub | 2023-03-23-fine-tune-flan-t5-peft.ipynb | Matrix Yao (published) |
2023-03-20 | Deploy FLAN-UL2 20B on Amazon SageMaker | GitHub | 2023-03-20-deploy-flan-ul2-sagemaker.ipynb | |
2023-03-16 | Getting started with Pytorch 2.0 and Hugging Face Transformers | GitHub | 2023-03-16-getting-started-pytorch-2-0-transformers.ipynb | VermillionDe (claimed) |
2023-03-03 | Controlled text-to-image generation with ControlNet on Inference Endpoints | none | 2023-03-03-stable-diffusion-controlnet-endpoint.md | VermillionDe (claimed) |
2023-02-22 | Combine Amazon SageMaker and DeepSpeed to fine-tune FLAN-T5 XXL | GitHub | 2023-02-22-sagemaker-deepspeed.ipynb | |
2023-02-16 | Fine-tune FLAN-T5 XL/XXL using DeepSpeed & Hugging Face Transformers | GitHub | 2023-02-16-fine-tune-flan-t5-deepspeed.ipynb | Matrix Yao (completed) |
2023-02-08 | Deploy FLAN-T5 XXL on Amazon SageMaker | GitHub | 2023-02-08-deploy-flan-t5-sagemaker.ipynb | |
2023-01-26 | Hugging Face Transformers Examples | none | 2023-01-26-huggingface-transformers-examples.md | |
2023-01-16 | Getting started with Transformers and TPU using PyTorch | GitHub | 2023-01-16-getting-started-tpu-transformers.ipynb | |
New entry: https://www.philschmid.de/bloom-sagemaker-peft (April 13) @chenglu
Claiming https://huggingface.co/blog/4bit-transformers-bitsandbytes, PR in progress @chenglu