
---
language: en
---

# bert-base-cased-conversational

Conversational BERT (English, cased, 12-layer, 768-hidden, 12-heads, 110M parameters) was trained on the English part of Twitter, Reddit, DailyDialogues[1], OpenSubtitles[2], Debates[3], Blogs[4], and Facebook News Comments. We used this training data to build the vocabulary of English subtokens and took the English cased version of BERT-base as the initialization for English Conversational BERT.
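
As a quick usage sketch (not part of the original card), the checkpoint can be loaded with the generic `transformers` Auto classes, assuming a recent transformers version (v4+) where model outputs are returned as dataclasses; the example sentence is illustrative.

```python
from transformers import AutoModel, AutoTokenizer

# Load the cased conversational BERT checkpoint from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/bert-base-cased-conversational")
model = AutoModel.from_pretrained("DeepPavlov/bert-base-cased-conversational")

# Encode a short conversational utterance and run it through the encoder.
inputs = tokenizer("Hey, how's it going?", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, 768),
# matching the 768-hidden BERT-base configuration described above.
print(outputs.last_hidden_state.shape)
```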

[1] DailyDialog dataset
[2] OpenSubtitles corpus
[3] Debates corpus
[4] Blogs corpus