XLM
----------------------------------------------------

Overview
~~~~~~~~~~~~~~~~~~~~~

The XLM model was proposed in `Cross-lingual Language Model Pretraining <https://arxiv.org/abs/1901.07291>`_
by Guillaume Lample, Alexis Conneau. It's a transformer pre-trained using one of the following objectives:

- a causal language modeling (CLM) objective (next token prediction),
- a masked language modeling (MLM) objective (BERT-like), or
- a translation language modeling (TLM) objective (an extension of BERT's MLM to multiple language inputs).
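
The TLM objective above extends MLM to parallel data: a sentence and its translation are concatenated, and
masked tokens can be predicted from context in either language. The sketch below illustrates the idea only;
the token ids, mask id, and masking rate are made-up stand-ins, and the real objective additionally resets
positions and adds language embeddings for each segment.

```python
import random

def make_tlm_example(src_ids, tgt_ids, mask_id, p=0.15, seed=0):
    """Concatenate a parallel sentence pair and randomly mask tokens in both
    languages.

    Returns (tokens, labels), where labels is -100 (the usual "ignore" value)
    everywhere except at masked positions, which hold the original token id
    the model must predict.
    """
    rng = random.Random(seed)
    tokens = src_ids + tgt_ids
    labels = [-100] * len(tokens)
    for i, original in enumerate(tokens):
        if rng.random() < p:
            labels[i] = original   # remember what to predict here
            tokens[i] = mask_id    # hide it from the model
    return tokens, labels

# A parallel English/French pair, as made-up ids:
tokens, labels = make_tlm_example([11, 12, 13], [21, 22, 23, 24], mask_id=0)
```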

The abstract from the paper is the following:

*Recent studies have demonstrated the efficiency of generative pretraining for English natural language understanding.
In this work, we extend this approach to multiple languages and show the effectiveness of cross-lingual pretraining.
We propose two methods to learn cross-lingual language models (XLMs): one unsupervised that only relies on monolingual
data, and one supervised that leverages parallel data with a new cross-lingual language model objective. We obtain
state-of-the-art results on cross-lingual classification, unsupervised and supervised machine translation. On XNLI,
our approach pushes the state of the art by an absolute gain of 4.9% accuracy. On unsupervised machine translation,
we obtain 34.3 BLEU on WMT'16 German-English, improving the previous state of the art by more than 9 BLEU. On
supervised machine translation, we obtain a new state of the art of 38.5 BLEU on WMT'16 Romanian-English, outperforming
the previous best approach by more than 4 BLEU. Our code and pretrained models will be made publicly available.*

Tips:

- XLM has many different checkpoints, which were trained using different objectives: CLM, MLM or TLM. Make sure to
  select the correct objective for your task (e.g. MLM checkpoints are not suitable for generation).
- XLM has multilingual checkpoints which leverage a specific ``lang`` parameter. Check out the
  `multi-lingual <../multilingual.html>`__ page for more information.
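
With multilingual checkpoints, each input position also receives a language embedding, so the model expects a
``langs`` input shaped like ``input_ids``. A minimal sketch of building it, using plain lists and placeholder
ids (with a real checkpoint the token ids come from ``XLMTokenizer`` and the language id from
``tokenizer.lang2id``):

```python
# Placeholder token ids; with a real checkpoint they would come from
# an XLMTokenizer, e.g. tokenizer.encode("Wikipedia was used to").
input_ids = [0, 9, 7, 1]

# Placeholder language id; with a real checkpoint: tokenizer.lang2id["en"].
language_id = 4

# One language id per position, mirroring the shape of input_ids. With
# PyTorch tensors this would be torch.full_like(input_ids, language_id).
langs = [language_id] * len(input_ids)

# Both are then passed to a loaded multilingual model, e.g.:
# outputs = model(input_ids_tensor, langs=langs_tensor)
```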

The original code can be found `here <https://github.com/facebookresearch/XLM/>`_.

XLMConfig
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.XLMConfig
    :members:

XLMTokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.XLMTokenizer
    :members: build_inputs_with_special_tokens, get_special_tokens_mask,
        create_token_type_ids_from_sequences, save_vocabulary
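
As a rough illustration of the input format that ``build_inputs_with_special_tokens`` produces (the special
token ids below are placeholders; real values come from the pretrained vocabulary), XLM wraps a single
sequence as ``<s> A </s>`` and a pair as ``<s> A </s> B </s>``:

```python
# Placeholder special-token ids; real ids come from the pretrained vocabulary.
bos_token_id = 0  # <s>
sep_token_id = 1  # </s>

def build_inputs_with_special_tokens(token_ids_0, token_ids_1=None):
    """Sketch of XLM's input format: <s> A </s> for a single sequence,
    <s> A </s> B </s> for a sequence pair."""
    bos, sep = [bos_token_id], [sep_token_id]
    if token_ids_1 is None:
        return bos + token_ids_0 + sep
    return bos + token_ids_0 + sep + token_ids_1 + sep

single = build_inputs_with_special_tokens([5, 6])        # [0, 5, 6, 1]
pair = build_inputs_with_special_tokens([5, 6], [7, 8])  # [0, 5, 6, 1, 7, 8, 1]
```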


XLM specific outputs
~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.modeling_xlm.XLMForQuestionAnsweringOutput
    :members:


XLMModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.XLMModel
    :members:


XLMWithLMHeadModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.XLMWithLMHeadModel
    :members:


XLMForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.XLMForSequenceClassification
    :members:


XLMForMultipleChoice
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.XLMForMultipleChoice
    :members:


XLMForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.XLMForTokenClassification
    :members:


XLMForQuestionAnsweringSimple
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.XLMForQuestionAnsweringSimple
    :members:


XLMForQuestionAnswering
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.XLMForQuestionAnswering
    :members:


TFXLMModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFXLMModel
    :members:


TFXLMWithLMHeadModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFXLMWithLMHeadModel
    :members:


TFXLMForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFXLMForSequenceClassification
    :members:


TFXLMForMultipleChoice
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFXLMForMultipleChoice
    :members:


TFXLMForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFXLMForTokenClassification
    :members:


TFXLMForQuestionAnsweringSimple
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFXLMForQuestionAnsweringSimple
    :members: