---
language: en
---
# bert-base-cased-conversational
Conversational BERT (English, cased, 12-layer, 768-hidden, 12-heads, 110M parameters) was trained on the English part of Twitter, Reddit, DailyDialogues[1], OpenSubtitles[2], Debates[3], Blogs[4], and Facebook News Comments. We used this training data to build the vocabulary of English subtokens and took the English cased version of BERT-base as the initialization for English Conversational BERT.
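
A minimal usage sketch (not part of the original card), assuming the checkpoint is published on the Hub under the `DeepPavlov/bert-base-cased-conversational` identifier; it loads the model with the `transformers` Auto classes and extracts contextual embeddings for a conversational utterance:

```python
from transformers import AutoModel, AutoTokenizer

# Hub identifier assumed from the model name above and the publisher's
# DeepPavlov namespace.
MODEL_ID = "DeepPavlov/bert-base-cased-conversational"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# Tokenize a short conversational utterance and run a forward pass.
inputs = tokenizer("Hey, how's it going?", return_tensors="pt")
outputs = model(**inputs)

# One 768-dimensional contextual vector per subtoken.
print(outputs.last_hidden_state.shape)  # torch.Size([1, sequence_length, 768])
```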
[1]: Li Y. et al. (2017). DailyDialog: A Manually Labelled Multi-turn Dialogue Dataset. IJCNLP.

[2]: Lison P., Tiedemann J. (2016). OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. LREC.

[3]: Zhang J. et al. (2016). Conversational Flow in Oxford-style Debates. NAACL.

[4]: Schler J. et al. (2006). Effects of Age and Gender on Blogging. AAAI Spring Symposium: Computational Approaches to Analyzing Weblogs.