Transformer XL
----------------------------------------------------

The Transformer-XL model was proposed in
`Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context <https://arxiv.org/abs/1901.02860>`_
by Zihang Dai*, Zhilin Yang*, Yiming Yang, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov.
It's a causal (uni-directional) transformer with relative (sinusoidal) positional embeddings which can reuse
previously computed hidden-states to attend to longer context (memory).
This model also uses adaptive softmax inputs and outputs (tied).
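
To make the memory mechanism concrete, here is a minimal sketch of feeding a long sequence segment by
segment while threading the returned memory back in. The ``transfo-xl-wt103`` checkpoint name and the
``(last_hidden_state, mems)`` output ordering are assumptions, not guarantees about this exact API.

.. code-block:: python

    import torch
    from transformers import TransfoXLTokenizer, TransfoXLModel

    tokenizer = TransfoXLTokenizer.from_pretrained("transfo-xl-wt103")
    model = TransfoXLModel.from_pretrained("transfo-xl-wt103")
    model.eval()

    text = "The Transformer-XL model reuses hidden states across segments ."
    input_ids = torch.tensor([tokenizer.encode(text)])

    # Split the sequence into fixed-size segments and thread the memory through:
    # each call attends to its own segment plus the cached states in ``mems``.
    mems = None
    for segment in torch.split(input_ids, 4, dim=1):
        with torch.no_grad():
            # Assumption: the first two outputs are (last_hidden_state, mems).
            last_hidden_state, mems = model(segment, mems=mems)[:2]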

The abstract from the paper is the following:

*Transformers have a potential of learning longer-term dependency, but are limited by a fixed-length context in the
setting of language modeling. We propose a novel neural architecture Transformer-XL that enables learning dependency
beyond a fixed length without disrupting temporal coherence. It consists of a segment-level recurrence mechanism and
a novel positional encoding scheme. Our method not only enables capturing longer-term dependency, but also resolves
the context fragmentation problem. As a result, Transformer-XL learns dependency that is 80% longer than RNNs and
450% longer than vanilla Transformers, achieves better performance on both short and long sequences, and is up
to 1,800+ times faster than vanilla Transformers during evaluation. Notably, we improve the state-of-the-art results
of bpc/perplexity to 0.99 on enwiki8, 1.08 on text8, 18.3 on WikiText-103, 21.8 on One Billion Word, and 54.5 on
Penn Treebank (without finetuning). When trained only on WikiText-103, Transformer-XL manages to generate reasonably
coherent, novel text articles with thousands of tokens.*

Tips:

- Transformer-XL uses relative sinusoidal positional embeddings, so it's usually advised to pad the inputs on
  the left rather than the right (see the padding sketch below).
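
As a sketch of the tip above, a batch can be padded on the left by hand; the ``pad_id`` value below is an
assumption and should be whatever id the vocabulary actually reserves for padding.

.. code-block:: python

    import torch

    # A hypothetical batch of variable-length sequences of token ids.
    sequences = [[10, 11, 12], [20, 21]]
    max_len = max(len(seq) for seq in sequences)
    pad_id = 0  # assumption: replace with the vocabulary's real padding id

    # Pad on the left so the most recent tokens sit next to the memory.
    batch = torch.tensor([[pad_id] * (max_len - len(seq)) + seq for seq in sequences])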

``TransfoXLConfig``
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TransfoXLConfig
    :members:
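
A short usage sketch: build a randomly initialized model from a default configuration. The default
hyperparameter values are whatever the class defines; none are asserted here.

.. code-block:: python

    from transformers import TransfoXLConfig, TransfoXLModel

    # Instantiate a configuration with the class defaults and build a model from it.
    configuration = TransfoXLConfig()
    model = TransfoXLModel(configuration)

    # The configuration can be read back from the model.
    configuration = model.config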

``TransfoXLTokenizer``
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TransfoXLTokenizer
    :members:
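
A minimal tokenization sketch, assuming the ``transfo-xl-wt103`` checkpoint supplies the word-level vocabulary:

.. code-block:: python

    from transformers import TransfoXLTokenizer

    tokenizer = TransfoXLTokenizer.from_pretrained("transfo-xl-wt103")

    # The Transformer-XL tokenizer is word-level rather than subword-level.
    tokens = tokenizer.tokenize("Hello , how are you ?")
    input_ids = tokenizer.convert_tokens_to_ids(tokens)
    text = tokenizer.decode(input_ids)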

``TransfoXLModel``
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TransfoXLModel
    :members:
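
A bare forward pass through the base model, as a sketch; the checkpoint name and the
``(last_hidden_state, mems)`` output ordering are assumptions.

.. code-block:: python

    import torch
    from transformers import TransfoXLTokenizer, TransfoXLModel

    tokenizer = TransfoXLTokenizer.from_pretrained("transfo-xl-wt103")
    model = TransfoXLModel.from_pretrained("transfo-xl-wt103")

    input_ids = torch.tensor([tokenizer.encode("Hello , my dog is cute")])
    # Assumption: the first two outputs are the hidden states and the memory.
    last_hidden_state, mems = model(input_ids)[:2]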

``TransfoXLLMHeadModel``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TransfoXLLMHeadModel
    :members:
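
A sketch of scoring text with the language-modeling head. Passing the inputs as ``labels`` and reading
per-token losses from the first output are assumptions about the forward signature; the adaptive-softmax
head may expose its loss differently.

.. code-block:: python

    import torch
    from transformers import TransfoXLTokenizer, TransfoXLLMHeadModel

    tokenizer = TransfoXLTokenizer.from_pretrained("transfo-xl-wt103")
    model = TransfoXLLMHeadModel.from_pretrained("transfo-xl-wt103")
    model.eval()

    input_ids = torch.tensor([tokenizer.encode("Hello , my dog is cute")])
    with torch.no_grad():
        # Assumption: with ``labels`` supplied, the first output holds token losses.
        outputs = model(input_ids, labels=input_ids)
    perplexity = torch.exp(outputs[0].mean())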

``TFTransfoXLModel``
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFTransfoXLModel
    :members:
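
The TensorFlow base model follows the same pattern as the PyTorch one; this sketch assumes the tokenizer
accepts ``return_tensors="tf"`` and that the outputs keep the ``(last_hidden_state, mems)`` ordering.

.. code-block:: python

    from transformers import TransfoXLTokenizer, TFTransfoXLModel

    tokenizer = TransfoXLTokenizer.from_pretrained("transfo-xl-wt103")
    model = TFTransfoXLModel.from_pretrained("transfo-xl-wt103")

    input_ids = tokenizer.encode("Hello , my dog is cute", return_tensors="tf")
    # Assumption: outputs mirror the PyTorch model's (last_hidden_state, mems).
    last_hidden_state, mems = model(input_ids)[:2]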

``TFTransfoXLLMHeadModel``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFTransfoXLLMHeadModel
    :members:
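
And the TensorFlow language-modeling head, with the same caveat that the output ordering below is an
assumption mirroring the PyTorch class:

.. code-block:: python

    from transformers import TransfoXLTokenizer, TFTransfoXLLMHeadModel

    tokenizer = TransfoXLTokenizer.from_pretrained("transfo-xl-wt103")
    model = TFTransfoXLLMHeadModel.from_pretrained("transfo-xl-wt103")

    input_ids = tokenizer.encode("Hello , my dog is cute", return_tensors="tf")
    # Assumption: the first output holds prediction scores from the adaptive softmax.
    prediction_scores, mems = model(input_ids)[:2]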