---
language:
- bulgarian
- czech
- polish
- russian
---

# bert-base-bg-cs-pl-ru-cased

SlavicBERT [1] (Slavic: bg, cs, pl, ru; cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on Russian News and four Wikipedias: Bulgarian, Czech, Polish, and Russian. The subtoken vocabulary was built using this data. Multilingual BERT was used as the initialization for SlavicBERT.
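The model can be loaded through the `transformers` library. The sketch below is a minimal usage example, assuming the standard `AutoTokenizer`/`AutoModel` API and the `DeepPavlov/bert-base-bg-cs-pl-ru-cased` Hub identifier:

```python
from transformers import AutoTokenizer, AutoModel

# Load the cased SlavicBERT tokenizer and encoder from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/bert-base-bg-cs-pl-ru-cased")
model = AutoModel.from_pretrained("DeepPavlov/bert-base-bg-cs-pl-ru-cased")

# Encode a Russian sentence and run it through the encoder.
inputs = tokenizer("Москва - столица России.", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state has shape (batch, sequence_length, 768):
# one 768-dimensional contextual embedding per subtoken.
print(outputs.last_hidden_state.shape)
```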

[1]: Arkhipov M., Trofimova M., Kuratov Y., Sorokin A. (2019). Tuning Multilingual Transformers for Language-Specific Named Entity Recognition. Proceedings of the 7th Workshop on Balto-Slavic Natural Language Processing.