
---
language: ru
---

# rubert-base-cased-conversational

Conversational RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on OpenSubtitles[1], Dirty, Pikabu, and the social media segment of the Taiga corpus[2]. We assembled a new vocabulary for the Conversational RuBERT model on this data and initialized the model with RuBERT.
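As a minimal usage sketch, the checkpoint can be loaded as a plain encoder with the `transformers` library (the snippet assumes `transformers` and `torch` are installed and that the weights can be downloaded from the Hugging Face Hub):

```python
# Minimal sketch: load Conversational RuBERT and encode one Russian sentence.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "DeepPavlov/rubert-base-cased-conversational"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Привет, как дела?", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The encoder is 12-layer / 768-hidden, so each token embedding has size 768.
print(outputs.last_hidden_state.shape)  # (1, num_tokens, 768)
```

For downstream tasks (e.g. classification on conversational Russian text), the same checkpoint can be loaded with a task-specific head such as `AutoModelForSequenceClassification`.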
