From 6a5fd0c6d284dbfeb6af8b9cca3d6a7a81e5630f Mon Sep 17 00:00:00 2001
From: Lysandre Debut
Date: Thu, 12 Jun 2025 15:43:31 +0200
Subject: [PATCH] Reword README in light of model definitions (#38762)

* Slight readme reword

* reword

* reword

* reword

* Slight readme reword

---
 README.md               | 16 +++++++++++++---
 docs/source/en/index.md | 20 ++++++++++++++++++--
 2 files changed, 31 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 00f90f7d6c8..13afb112caa 100644
--- a/README.md
+++ b/README.md
@@ -59,12 +59,22 @@ limitations under the License.

- +

-Transformers is a library of pretrained text, computer vision, audio, video, and multimodal models for inference and training. Use Transformers to fine-tune models on your data, build inference applications, and for generative AI use cases across multiple modalities.
-
-There are over 500K+ Transformers [model checkpoints](https://huggingface.co/models?library=transformers&sort=trending) on the [Hugging Face Hub](https://huggingface.com/models) you can use.
+Transformers acts as the model-definition framework for state-of-the-art machine learning models in text, computer
+vision, audio, video, and multimodal models, for both inference and training.
+
+It centralizes the model definition so that this definition is agreed upon across the ecosystem. `transformers` is the
+pivot across frameworks: if a model definition is supported, it will be compatible with the majority of training
+frameworks (Axolotl, Unsloth, DeepSpeed, FSDP, PyTorch-Lightning, ...), inference engines (vLLM, SGLang, TGI, ...),
+and adjacent modeling libraries (llama.cpp, mlx, ...) that leverage the model definition from `transformers`.
+
+We pledge to help support new state-of-the-art models and democratize their usage by keeping their model definitions
+simple, customizable, and efficient.
+
+There are over 1M Transformers [model checkpoints](https://huggingface.co/models?library=transformers&sort=trending) on the [Hugging Face Hub](https://huggingface.com/models) you can use.
 
 Explore the [Hub](https://huggingface.com/) today to find a model and use Transformers to help you get started right away.
 
diff --git a/docs/source/en/index.md b/docs/source/en/index.md
index 5c3898ce788..b2b826c445c 100644
--- a/docs/source/en/index.md
+++ b/docs/source/en/index.md
@@ -15,9 +15,25 @@ rendered properly in your Markdown viewer.
 
 # Transformers
 
-Transformers is a library of pretrained natural language processing, computer vision, audio, and multimodal models for inference and training. Use Transformers to train models on your data, build inference applications, and generate text with large language models.
+

+ +

-Explore the [Hugging Face Hub](https://huggingface.com) today to find a model and use Transformers to help you get started right away.
+
+Transformers acts as the model-definition framework for state-of-the-art machine learning models in text, computer
+vision, audio, video, and multimodal models, for both inference and training.
+
+It centralizes the model definition so that this definition is agreed upon across the ecosystem. `transformers` is the
+pivot across frameworks: if a model definition is supported, it will be compatible with the majority of training
+frameworks (Axolotl, Unsloth, DeepSpeed, FSDP, PyTorch-Lightning, ...), inference engines (vLLM, SGLang, TGI, ...),
+and adjacent modeling libraries (llama.cpp, mlx, ...) that leverage the model definition from `transformers`.
+
+We pledge to help support new state-of-the-art models and democratize their usage by keeping their model definitions
+simple, customizable, and efficient.
+
+There are over 1M Transformers [model checkpoints](https://huggingface.co/models?library=transformers&sort=trending) on the [Hugging Face Hub](https://huggingface.com/models) you can use.
+
+Explore the [Hub](https://huggingface.com/) today to find a model and use Transformers to help you get started right away.
 
 ## Features
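For context on what "leveraging the model definition from `transformers`" means in practice, here is a minimal sketch, separate from the patch itself, using the library's `pipeline` API; the checkpoint name is only illustrative, and any Hub checkpoint with a supported model definition could be substituted.

```python
# Minimal sketch: load a Hub checkpoint through its Transformers model definition
# and run inference. Downstream trainers and inference engines reuse this same
# definition. The checkpoint name below is illustrative, not prescribed by the patch.
from transformers import pipeline

generator = pipeline(task="text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")
print(generator("Transformers is a library that", max_new_tokens=30)[0]["generated_text"])
```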