# rubert-base-cased-conversational
Conversational RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on OpenSubtitles\[1\], Dirty, Pikabu, and a Social Media segment of the Taiga corpus\[2\]. We assembled a new vocabulary for the Conversational RuBERT model on this data and initialized the model with RuBERT.
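The architecture numbers above (12 layers, hidden size 768, 12 heads, ~180M parameters) can be sanity-checked with a quick back-of-the-envelope parameter count. This is only a sketch: the ~120k-token vocabulary size used below is an assumption (a RuBERT-style vocabulary), since the size of the newly assembled vocabulary is not stated here.

```python
# Rough BERT-base-style parameter count for the configuration above.
# VOCAB is an assumed value (~120k tokens), not taken from this card.
H, NUM_LAYERS, VOCAB, MAX_POS, TYPE_VOCAB = 768, 12, 120_000, 512, 2
INTERMEDIATE = 4 * H  # standard BERT feed-forward width

# Embeddings: token + position + segment tables, plus one LayerNorm
embeddings = (VOCAB + MAX_POS + TYPE_VOCAB) * H + 2 * H

# One encoder layer: Q/K/V/output projections (weights + biases),
# two feed-forward matrices, and two LayerNorms
attention = 4 * (H * H + H)
ffn = H * INTERMEDIATE + INTERMEDIATE + INTERMEDIATE * H + H
layer_norms = 2 * 2 * H
per_layer = attention + ffn + layer_norms

# Pooler head: one dense H x H projection with bias
pooler = H * H + H

total = embeddings + NUM_LAYERS * per_layer + pooler
print(f"{total / 1e6:.0f}M parameters")  # prints "178M parameters"
```

The count lands within a couple of million of the 180M figure quoted above; most of the gap between this model and the ~110M of English BERT-base comes from the much larger vocabulary embedding table.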
\[1\]: P. Lison and J. Tiedemann, 2016. OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016).

\[2\]: Shavrina T., Shapovalova O. (2017) To the methodology of corpus construction for machine learning: "Taiga" syntax tree corpus and parser.