..
    Copyright 2020 The HuggingFace Team. All rights reserved.

    Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
    the License. You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
    an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
    specific language governing permissions and limitations under the License.

Blenderbot Small
--------------------------------------------------------------------------------------------------------------------

Note that :class:`~transformers.BlenderbotSmallModel` and
:class:`~transformers.BlenderbotSmallForConditionalGeneration` are only used in combination with the checkpoint
`facebook/blenderbot-90M <https://huggingface.co/facebook/blenderbot-90M>`__. Larger Blenderbot checkpoints should
instead be used with :class:`~transformers.BlenderbotModel` and
:class:`~transformers.BlenderbotForConditionalGeneration`.
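
As a quick reference, the snippet below sketches single-turn generation with the ``facebook/blenderbot-90M``
checkpoint. It is a minimal sketch rather than a prescribed recipe: it assumes PyTorch is installed and the checkpoint
can be downloaded from the Hub, and the input utterance is arbitrary.

.. code-block:: python

    from transformers import BlenderbotSmallForConditionalGeneration, BlenderbotSmallTokenizer

    checkpoint = "facebook/blenderbot-90M"
    tokenizer = BlenderbotSmallTokenizer.from_pretrained(checkpoint)
    model = BlenderbotSmallForConditionalGeneration.from_pretrained(checkpoint)

    # Encode a single utterance (example sentence, chosen arbitrarily) and generate a reply
    utterance = "My friends are cool but they eat too many carbs."
    inputs = tokenizer([utterance], return_tensors="pt")
    reply_ids = model.generate(**inputs)
    print(tokenizer.batch_decode(reply_ids, skip_special_tokens=True))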

Overview
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The Blender chatbot model was proposed in `Recipes for building an open-domain chatbot
<https://arxiv.org/pdf/2004.13637.pdf>`__ by Stephen Roller, Emily Dinan, Naman Goyal, Da Ju, Mary Williamson,
Yinhan Liu, Jing Xu, Myle Ott, Kurt Shuster, Eric M. Smith, Y-Lan Boureau and Jason Weston on 30 Apr 2020.

The abstract of the paper is the following:

*Building open-domain chatbots is a challenging area for machine learning research. While prior work has shown that
scaling neural models in the number of parameters and the size of the data they are trained on gives improved results,
we show that other ingredients are important for a high-performing chatbot. Good conversation requires a number of
skills that an expert conversationalist blends in a seamless way: providing engaging talking points and listening to
their partners, and displaying knowledge, empathy and personality appropriately, while maintaining a consistent
persona. We show that large scale models can learn these skills when given appropriate training data and choice of
generation strategy. We build variants of these recipes with 90M, 2.7B and 9.4B parameter models, and make our models
and code publicly available. Human evaluations show our best models are superior to existing approaches in multi-turn
dialogue in terms of engagingness and humanness measurements. We then discuss the limitations of this work by analyzing
failure cases of our models.*

The authors' code can be found `here <https://github.com/facebookresearch/ParlAI>`__.

BlenderbotSmallConfig
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.BlenderbotSmallConfig
    :members:
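
The configuration follows the library's usual pattern; as a short sketch (default configuration values, randomly
initialized weights):

.. code-block:: python

    from transformers import BlenderbotSmallConfig, BlenderbotSmallModel

    # Build a configuration with default values
    configuration = BlenderbotSmallConfig()

    # Instantiate a model from the configuration; weights are randomly initialized
    model = BlenderbotSmallModel(configuration)

    # The configuration can be read back from the model
    configuration = model.config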


BlenderbotSmallTokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.BlenderbotSmallTokenizer
    :members: build_inputs_with_special_tokens, get_special_tokens_mask,
        create_token_type_ids_from_sequences, save_vocabulary
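
A minimal tokenization sketch (the example sentence is arbitrary; the checkpoint name comes from the note above):

.. code-block:: python

    from transformers import BlenderbotSmallTokenizer

    tokenizer = BlenderbotSmallTokenizer.from_pretrained("facebook/blenderbot-90M")

    # Encode a sentence into token ids, then decode them back into text
    input_ids = tokenizer("hello, how are you?")["input_ids"]
    print(input_ids)
    print(tokenizer.decode(input_ids))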


BlenderbotSmallModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.BlenderbotSmallModel
    :members: forward


BlenderbotSmallForConditionalGeneration
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.BlenderbotSmallForConditionalGeneration
    :members: forward