
---
language: multilingual
datasets: wikipedia
license: apache-2.0
widget:
- text: "Google generated 46 billion [MASK] in revenue."
- text: "Paris is the capital of [MASK]."
- text: "Algiers is the largest city in [MASK]."
- text: "Paris est la [MASK] de la France."
- text: "Paris est la capitale de la [MASK]."
- text: "L'élection américaine a eu [MASK] en novembre 2020."
---

# bert-base-en-fr-cased

We are sharing smaller versions of `bert-base-multilingual-cased` that handle a custom number of languages.

Unlike `distilbert-base-multilingual-cased`, our versions produce exactly the same representations as the original model, which preserves the original accuracy.
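
To see what that claim means in practice, here is a minimal sketch (not part of the original card) that compares the hidden states of this model with those of `bert-base-multilingual-cased` on one of the widget sentences. It assumes both tokenizers split the sentence into the same tokens, which should hold for English/French text covered by the reduced vocabulary:

```python
import torch
from transformers import AutoTokenizer, AutoModel

text = "Paris est la capitale de la France."

# The reduced en/fr model and the original multilingual model.
small_tok = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-fr-cased")
small_model = AutoModel.from_pretrained("Geotrend/bert-base-en-fr-cased")
full_tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
full_model = AutoModel.from_pretrained("bert-base-multilingual-cased")

with torch.no_grad():
    small_hidden = small_model(**small_tok(text, return_tensors="pt")).last_hidden_state
    full_hidden = full_model(**full_tok(text, return_tensors="pt")).last_hidden_state

# Same tokens in, same representations out (up to numerical precision).
print(torch.allclose(small_hidden, full_hidden, atol=1e-5))
```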

For more information, please see our paper: *Load What You Need: Smaller Versions of Multilingual BERT*.

## How to use

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-fr-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-fr-cased")
```
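
As a usage example (a minimal sketch, not from the original card), the same checkpoint can also be plugged into the `fill-mask` pipeline to predict masked tokens, e.g. on the widget sentences above:

```python
from transformers import pipeline

# Masked-token prediction with this checkpoint.
fill_mask = pipeline("fill-mask", model="Geotrend/bert-base-en-fr-cased")

for prediction in fill_mask("Paris est la capitale de la [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```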

To generate other smaller versions of multilingual transformers, please visit our GitHub repo.

## How to cite

```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
  author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
  booktitle={SustaiNLP / EMNLP},
  year={2020}
}
```

## Contact

Please contact amine@geotrend.fr for any questions, feedback, or requests.