Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-31 02:02:21 +06:00)
Add Falcon3 documentation (#35307)
* Add Falcon3 documentation
* Update Falcon3 documentation
* Change Falcon to Falcon3
* Update docs and run make fix-copies
* Add blog post and huggingface models links
This commit is contained in: parent f33a0cebb3 · commit 6c08b3b6e5
@@ -396,6 +396,8 @@
       title: ESM
     - local: model_doc/falcon
       title: Falcon
+    - local: model_doc/falcon3
+      title: Falcon3
     - local: model_doc/falcon_mamba
       title: FalconMamba
     - local: model_doc/fastspeech2_conformer
@@ -141,6 +141,7 @@ Flax), PyTorch, and/or TensorFlow.
 | [ESM](model_doc/esm) | ✅ | ✅ | ❌ |
 | [FairSeq Machine-Translation](model_doc/fsmt) | ✅ | ❌ | ❌ |
 | [Falcon](model_doc/falcon) | ✅ | ❌ | ❌ |
+| [Falcon3](model_doc/falcon3) | ✅ | ❌ | ✅ |
 | [FalconMamba](model_doc/falcon_mamba) | ✅ | ❌ | ❌ |
 | [FastSpeech2Conformer](model_doc/fastspeech2_conformer) | ✅ | ❌ | ❌ |
 | [FLAN-T5](model_doc/flan-t5) | ✅ | ✅ | ✅ |
docs/source/en/model_doc/falcon3.md (new file, 29 lines)
@@ -0,0 +1,29 @@
<!--Copyright 2024 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.

⚠️ Note that this file is in Markdown but contains specific syntax for our doc-builder (similar to MDX) that may not be
rendered properly in your Markdown viewer.

-->
# Falcon3

## Overview
Falcon3 represents a natural evolution from previous releases, emphasizing expanding the models' science, math, and code capabilities. This iteration includes five base models: Falcon3-1B-Base, Falcon3-3B-Base, Falcon3-Mamba-7B-Base, Falcon3-7B-Base, and Falcon3-10B-Base. In developing these models, we incorporated several key innovations aimed at improving the models' performance while reducing training costs:

- **One pre-training:** We conducted a single large-scale pretraining run on the 7B model, using 2048 H100 GPU chips, leveraging 14 trillion tokens featuring web, code, STEM, and curated high-quality and multilingual data.
- **Depth up-scaling for improved reasoning:** Building on recent studies on the effects of model depth, we upscaled the 7B model to a 10B parameters model by duplicating the redundant layers and continuing pre-training with 2 trillion tokens of high-quality data. This yielded Falcon3-10B-Base, which achieves state-of-the-art zero-shot and few-shot performance for models under 13B parameters.
- **Knowledge distillation for better tiny models:** To provide compact and efficient alternatives, we developed Falcon3-1B-Base and Falcon3-3B-Base by leveraging pruning and knowledge distillation techniques, using less than 100 gigatokens of curated high-quality data, thereby redefining pre-training efficiency.
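The depth up-scaling step (duplicate layers, then continue pre-training) can be sketched with a toy layer stack. This is only an illustration: the layer counts and the choice of which span to duplicate are assumptions, not the released Falcon3 recipe.

```python
# Toy sketch of depth up-scaling: grow a model by duplicating a
# contiguous span of its transformer layers, then continuing pre-training
# on the deeper stack. Counts and the duplicated range are illustrative.

def upscale_depth(layers, start, end):
    """Return a deeper stack in which layers[start:end] appear twice."""
    return layers[:end] + layers[start:end] + layers[end:]

# A 28-layer "7B-class" stack, labeled by index.
base = [f"layer_{i}" for i in range(28)]

# Duplicate the middle 12 layers -> a 40-layer "10B-class" stack.
deeper = upscale_depth(base, start=8, end=20)

print(len(base), len(deeper))  # 28 40
```

The duplicated layers start out as exact copies, which is why continued pre-training on fresh tokens is needed to make the added depth useful.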

## Resources

- [Blog post](https://huggingface.co/blog/falcon3)
- [Models on Hugging Face](https://huggingface.co/collections/tiiuae/falcon3-67605ae03578be86e4e87026)
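A minimal usage sketch may help readers of this page. It assumes the checkpoint id `tiiuae/Falcon3-7B-Base` from the linked collection and the standard `transformers` Auto classes; the `generate_text` helper is illustrative, not part of the release.

```python
# Minimal usage sketch for a Falcon3 base checkpoint. Assumption: the
# checkpoint id below comes from the Hugging Face collection linked above.
# Falcon3 checkpoints load through the standard Auto classes.

FALCON3_7B = "tiiuae/Falcon3-7B-Base"

def generate_text(prompt: str, model_id: str = FALCON3_7B, max_new_tokens: int = 32) -> str:
    """Illustrative helper: generate a continuation from a Falcon3 checkpoint."""
    # transformers is imported lazily so the snippet can be read without
    # the (large) dependency and model download.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Running the helper downloads the checkpoint on first use, so it is best tried on a machine with enough memory for the 7B weights.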
@@ -415,6 +415,7 @@ MODEL_NAMES_MAPPING = OrderedDict(
         ("ernie_m", "ErnieM"),
         ("esm", "ESM"),
         ("falcon", "Falcon"),
+        ("falcon3", "Falcon3"),
         ("falcon_mamba", "FalconMamba"),
         ("fastspeech2_conformer", "FastSpeech2Conformer"),
         ("flan-t5", "FLAN-T5"),
@@ -157,6 +157,7 @@ MODEL_NAMES_WITH_SAME_CONFIG = {
     "LayoutXLM": "LayoutLMv2",
     "Llama2": "LLaMA",
     "Llama3": "LLaMA",
+    "Falcon3": "LLaMA",
     "MADLAD-400": "T5",
     "MatCha": "Pix2Struct",
     "mBART-50": "mBART",