Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-04 05:10:06 +06:00)
docs: add LLaMA-Efficient-Tuning to awesome-transformers (#25441)
Co-authored-by: statelesshz <jihuazhong1@huawei.com>
This commit is contained in:
parent: a7da2996a0
commit: 347001237a
@@ -601,3 +601,9 @@ All Hugging Face models and pipelines can be seamlessly integrated into BentoML
Keywords: BentoML, Framework, Deployment, AI Applications
## [LLaMA-Efficient-Tuning](https://github.com/hiyouga/LLaMA-Efficient-Tuning)
[LLaMA-Efficient-Tuning](https://github.com/hiyouga/LLaMA-Efficient-Tuning) offers a user-friendly fine-tuning framework that incorporates PEFT. The repository includes training (fine-tuning) and inference examples for LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, and other LLMs. A ChatGLM version is also available in [ChatGLM-Efficient-Tuning](https://github.com/hiyouga/ChatGLM-Efficient-Tuning).
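The low-rank adaptation idea behind PEFT-style fine-tuning can be sketched in a few lines. This is a minimal illustration of the math, not the framework's actual code; all shapes, the scaling convention, and the helper function are illustrative assumptions:

```python
import numpy as np

# LoRA-style update: instead of training the full weight W (d x k),
# train two small matrices B (d x r) and A (r x k) with r << min(d, k),
# and use W_eff = W + (alpha / r) * B @ A at inference time.
d, k, r, alpha = 64, 64, 8, 16
rng = np.random.default_rng(0)
W = rng.standard_normal((d, k))        # frozen pretrained weight
A = rng.standard_normal((r, k)) * 0.01 # small random init
B = np.zeros((d, r))                   # B starts at zero, so W_eff == W initially

def effective_weight(W, A, B, alpha, r):
    # Illustrative helper: merge the low-rank update into the base weight.
    return W + (alpha / r) * B @ A

W_eff = effective_weight(W, A, B, alpha, r)

# Only B and A are trainable: a small fraction of the full matrix.
full_params = d * k
lora_params = d * r + r * k
print(lora_params / full_params)  # 0.25 for these illustrative shapes
```

Because B is initialized to zero, the adapted model starts out identical to the pretrained one, and only the small A/B factors are updated during fine-tuning.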
Keywords: PEFT, fine-tuning, LLaMA-2, ChatGLM, Qwen