
---
language: ru
---

# rubert-base-cased-conversational

Conversational RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on OpenSubtitles[1], Dirty, Pikabu, and a Social Media segment of the Taiga corpus[2]. We assembled a new vocabulary for the Conversational RuBERT model on this data and initialized the model with RuBERT.
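A minimal sketch of loading the model with the Hugging Face `transformers` `AutoModel`/`AutoTokenizer` API and extracting contextual embeddings; the example sentence is an illustrative placeholder:

```python
from transformers import AutoModel, AutoTokenizer

# Model identifier on the Hugging Face Hub
model_name = "DeepPavlov/rubert-base-cased-conversational"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a Russian sentence ("hi, how are you?") and run a forward pass
inputs = tokenizer("привет, как дела?", return_tensors="pt")
outputs = model(**inputs)

# Hidden states have dimensionality 768, matching the 768-hidden config above
print(outputs.last_hidden_state.shape)
```

The last hidden state has shape `(batch_size, sequence_length, 768)` and can be fed into a downstream classifier or pooled into a sentence embedding.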

1: P. Lison and J. Tiedemann, 2016, OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016).

2: Shavrina T., Shapovalova O. (2017) To the Methodology of Corpus Construction for Machine Learning: "Taiga" Syntax Tree Corpus and Parser. In proc. of "CORPORA 2017" international conference, Saint-Petersburg, 2017.