---
language: ru
---

# rubert-base-cased

RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on the Russian part of Wikipedia and news data. We used this training data to build a vocabulary of Russian subtokens and took a multilingual version of BERT-base as an initialization for RuBERT [1].
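As a minimal usage sketch, the model can be loaded through the standard `transformers` Auto classes; the `DeepPavlov/rubert-base-cased` hub identifier is assumed from this card's repository path.

```python
from transformers import AutoTokenizer, AutoModel

# Load the tokenizer (Russian subtoken vocabulary) and the cased BERT-base model.
tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/rubert-base-cased")
model = AutoModel.from_pretrained("DeepPavlov/rubert-base-cased")

# Encode a Russian sentence and obtain contextual embeddings.
inputs = tokenizer("Привет, мир!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768) for this 768-hidden model
```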

[1]: Kuratov, Y., Arkhipov, M. (2019). Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language. arXiv preprint arXiv:1905.07213.