---
language: russian
---

# rubert-base-cased

RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on the Russian part of Wikipedia and news data. We used this training data to build a vocabulary of Russian subtokens and took a multilingual version of BERT-base as an initialization for RuBERT \[1\].
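A minimal usage sketch (not part of the original card), assuming the model is available on the Hugging Face Hub under the identifier `DeepPavlov/rubert-base-cased` implied by this card's path:

```python
from transformers import AutoModel, AutoTokenizer

# Load the cased Russian tokenizer and encoder weights from the Hub.
tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/rubert-base-cased")
model = AutoModel.from_pretrained("DeepPavlov/rubert-base-cased")

# Encode a Russian sentence and run it through the encoder.
inputs = tokenizer("Привет, мир!", return_tensors="pt")
outputs = model(**inputs)

# Final hidden states: (batch_size, sequence_length, 768).
print(outputs.last_hidden_state.shape)
```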

\[1\]: Kuratov, Y., Arkhipov, M. (2019). Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language. arXiv preprint arXiv:1905.07213.