Embed circle packing chart for model summary (#20791)

* embed circle packing chart

* trim whitespace from bottom

* explain bubble sizes
Steven Liu 2022-12-20 10:26:52 -08:00 committed by GitHub
parent bd1a43b699
commit 3be028bc9d

@@ -12,7 +12,12 @@ specific language governing permissions and limitations under the License.
 # Summary of the models
-This is a summary of the models available in 🤗 Transformers. It assumes you're familiar with the original [transformer
+This is a summary of the most downloaded models in 🤗 Transformers. Click on the large outermost bubble of each pretrained model category (encoder, decoder, encoder-decoder) to zoom in and out to see the most popular models within a modality. The size of each bubble corresponds to the number of downloads of each model.
+
+<iframe width="100%" height="900" frameborder="0"
+src="https://observablehq.com/embed/eafbe39385aaf8f2?cells=chart"></iframe>
+
+It assumes you're familiar with the original [transformer
 model](https://arxiv.org/abs/1706.03762). For a gentle introduction check the [annotated transformer](http://nlp.seas.harvard.edu/2018/04/03/attention.html). Here we focus on the high-level differences between the
 models. You can check them more in detail in their respective documentation. Also check out [the Model Hub](https://huggingface.co/models) where you can filter the checkpoints by model architecture.
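For reference, the `<iframe>` added above is Observable's standard notebook embed, showing only the `chart` cell. An alternative is Observable's JavaScript runtime embed, which gives the host page finer control over where and how the cell renders. The sketch below is a minimal example of that pattern; the `api.observablehq.com` module URL and the cell name `chart` are inferred from the iframe URL in this commit, not confirmed by it.

```html
<!-- Minimal sketch: embed the same Observable cell with the runtime API.
     Assumes the notebook is shared at d/eafbe39385aaf8f2 and exposes a
     cell named "chart" (both inferred from the iframe URL above). -->
<div id="model-chart"></div>
<script type="module">
  import {Runtime, Inspector} from "https://cdn.jsdelivr.net/npm/@observablehq/runtime@5/dist/runtime.js";
  import define from "https://api.observablehq.com/d/eafbe39385aaf8f2.js?v=3";

  // Observe only the "chart" cell and render it into the placeholder div;
  // cells it depends on are computed as needed, nothing else is displayed.
  new Runtime().module(define, (name) =>
    name === "chart" ? new Inspector(document.querySelector("#model-chart")) : null
  );
</script>
```

The iframe route used in this commit is simpler and keeps the notebook sandboxed, which is usually the right trade-off for documentation pages; the runtime route matters mainly when the host page needs to style or resize the chart itself.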