---
language:
- bg
- cs
- pl
- ru
---
# bert-base-bg-cs-pl-ru-cased
SlavicBERT\[1\] (Slavic (bg, cs, pl, ru), cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on Russian News and four Wikipedias: Bulgarian, Czech, Polish, and Russian. The subtoken vocabulary was built from this data. Multilingual BERT was used as the initialization for SlavicBERT.
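
The checkpoint can be loaded with the standard `transformers` auto classes. Below is a minimal sketch; the hub identifier `DeepPavlov/bert-base-bg-cs-pl-ru-cased` is an assumption not stated in this card, so substitute the id the checkpoint is actually published under.

```python
# A minimal usage sketch with the standard transformers auto classes.
# The hub id "DeepPavlov/bert-base-bg-cs-pl-ru-cased" is an assumption;
# substitute the identifier the checkpoint is actually published under.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "DeepPavlov/bert-base-bg-cs-pl-ru-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a Bulgarian sentence ("This is an example sentence.") and
# extract contextual token embeddings.
inputs = tokenizer("Това е примерно изречение.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, 768),
# matching the 768-hidden configuration described above.
print(outputs.last_hidden_state.shape)
```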
\[1\]: Arkhipov M., Trofimova M., Kuratov Y., Sorokin A. (2019). Tuning Multilingual Transformers for Language-Specific Named Entity Recognition. In Proceedings of the 7th Workshop on Balto-Slavic Natural Language Processing (pp. 89–93).