Mirror of https://github.com/huggingface/transformers.git, synced 2025-08-03 03:31:05 +06:00
Embed circle packing chart for model summary (#20791)
* embed circle packing chart
* trim whitespace from bottom
* explain bubble sizes

Parent: bd1a43b699
Commit: 3be028bc9d
@@ -12,7 +12,12 @@ specific language governing permissions and limitations under the License.
 
 # Summary of the models
 
-This is a summary of the models available in 🤗 Transformers. It assumes you’re familiar with the original [transformer
+This is a summary of the most downloaded models in 🤗 Transformers. Click on the large outermost bubble of each pretrained model category (encoder, decoder, encoder-decoder) to zoom in and out to see the most popular models within a modality. The size of each bubble corresponds to the number of downloads of each model.
+
+<iframe width="100%" height="900" frameborder="0"
+    src="https://observablehq.com/embed/eafbe39385aaf8f2?cells=chart"></iframe>
+
+It assumes you're familiar with the original [transformer
 model](https://arxiv.org/abs/1706.03762). For a gentle introduction check the [annotated transformer](http://nlp.seas.harvard.edu/2018/04/03/attention.html). Here we focus on the high-level differences between the
 models. You can check them more in detail in their respective documentation. Also check out [the Model Hub](https://huggingface.co/models) where you can filter the checkpoints by model architecture.
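In a circle packing chart like the one embedded above, it is the bubble's *area* (not its radius) that should scale with the underlying value, so radii grow with the square root of the download count. A minimal sketch of that scaling rule, using made-up download counts and example model names (the real chart pulls live figures from the Hub via the Observable notebook):

```python
import math

# Hypothetical download counts, for illustration only; real figures
# come from the Hugging Face Hub and change over time.
downloads = {
    "bert-base-uncased": 40_000_000,
    "gpt2": 15_000_000,
    "t5-base": 5_000_000,
}

def bubble_radius(count: int, max_count: int, max_radius: float = 100.0) -> float:
    """Radius such that bubble area is proportional to the download count.

    The most-downloaded model gets max_radius; everything else scales
    by sqrt(count / max_count), keeping areas proportional to counts.
    """
    return max_radius * math.sqrt(count / max_count)

max_count = max(downloads.values())
radii = {name: bubble_radius(n, max_count) for name, n in downloads.items()}
```

With these numbers, gpt2's bubble ends up at roughly 61% of the largest radius even though it has only 37.5% of the downloads, which is exactly the area-vs-radius distinction the sqrt handles.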