🖍 remove broken link (#15615)
This commit is contained in:
parent 2f40c728c9, commit 85aee09e9a
@@ -38,10 +38,10 @@ Tips:
 - CTRL was trained with a causal language modeling (CLM) objective and is therefore powerful at predicting the next
   token in a sequence. Leveraging this feature allows CTRL to generate syntactically coherent text as it can be
   observed in the *run_generation.py* example script.
-- The PyTorch models can take the *past* as input, which is the previously computed key/value attention pairs. Using
-  this *past* value prevents the model from re-computing pre-computed values in the context of text generation. See
-  [reusing the past in generative models](../quickstart#using-the-past) for more information on the usage of
-  this argument.
+- The PyTorch models can take the `past_key_values` as input, which is the previously computed key/value attention pairs.
+  TensorFlow models accept `past` as input. Using the `past_key_values` value prevents the model from re-computing
+  pre-computed values in the context of text generation. See the [`forward`](model_doc/ctrl#transformers.CTRLModel.forward)
+  method for more information on the usage of this argument.
 
 This model was contributed by [keskarnitishr](https://huggingface.co/keskarnitishr). The original code can be found
 [here](https://github.com/salesforce/ctrl).
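The first tip in this hunk is about generation with CTRL's causal LM objective. The sketch below is illustrative only and not part of the commit; the `ctrl` checkpoint name, the `Links` control code, and the prompt are assumptions, and sampling settings are arbitrary:

```python
# Illustrative sketch: generate a continuation with CTRL's causal LM objective.
# The checkpoint name, control code, and prompt are assumptions, not from the commit.
from transformers import CTRLLMHeadModel, CTRLTokenizer

tokenizer = CTRLTokenizer.from_pretrained("ctrl")
model = CTRLLMHeadModel.from_pretrained("ctrl")

# CTRL prompts conventionally start with a control code such as "Links".
input_ids = tokenizer("Links My favorite hiking trail is", return_tensors="pt").input_ids

# Sample a continuation token by token from the next-token distribution.
output_ids = model.generate(input_ids, max_length=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```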
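The second tip concerns the `past_key_values` cache. Below is a minimal sketch of incremental decoding that reuses the cache instead of re-encoding the prompt at every step; the checkpoint and prompt are again assumptions, and greedy next-token selection is used only to keep the loop short:

```python
# Illustrative sketch: reuse cached key/value pairs (`past_key_values`) so the
# prompt is encoded once and each later step only processes the newest token.
import torch
from transformers import CTRLLMHeadModel, CTRLTokenizer

tokenizer = CTRLTokenizer.from_pretrained("ctrl")  # assumed checkpoint
model = CTRLLMHeadModel.from_pretrained("ctrl")
model.eval()

input_ids = tokenizer("Links My favorite hiking trail is", return_tensors="pt").input_ids

with torch.no_grad():
    # First pass: run the full prompt and keep the key/value cache.
    outputs = model(input_ids, use_cache=True)
    past_key_values = outputs.past_key_values
    next_token = outputs.logits[:, -1:].argmax(dim=-1)

    generated = [next_token]
    for _ in range(20):
        # Later passes: feed only the newest token together with the cache.
        outputs = model(next_token, past_key_values=past_key_values, use_cache=True)
        past_key_values = outputs.past_key_values
        next_token = outputs.logits[:, -1:].argmax(dim=-1)
        generated.append(next_token)

print(tokenizer.decode(torch.cat(generated, dim=-1)[0]))
```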