Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-31 02:02:21 +06:00)
Fixes bug in the creation of ExponentialDecayLengthPenalty (#21423)
`input_ids_seq_length` doesn't exist on `GenerationConfig`; it exists only as a local variable in the surrounding function. Setting `exponential_decay_length_penalty` therefore results in an error: `AttributeError: 'GenerationConfig' object has no attribute 'input_ids_seq_length'`. This simple change fixes the issue, and `exponential_decay_length_penalty` works as expected.
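For context, a minimal end-to-end sketch (the checkpoint, prompt, and penalty values are illustrative, not taken from the commit): with this fix, `generate()` computes `input_ids_seq_length` locally from the prompt and hands it to `ExponentialDecayLengthPenalty`, so the call below completes instead of raising the `AttributeError` above.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # illustrative checkpoint
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The quick brown fox", return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=30,
    # (start_index, decay_factor): the EOS score is boosted exponentially
    # starting start_index tokens after the end of the prompt.
    exponential_decay_length_penalty=(10, 1.05),
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```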
This commit is contained in:
parent 0a75717602
commit 6a3d1a98e0
```diff
@@ -859,7 +859,7 @@ class GenerationMixin:
                 ExponentialDecayLengthPenalty(
                     generation_config.exponential_decay_length_penalty,
                     generation_config.eos_token_id,
-                    generation_config.input_ids_seq_length,
+                    input_ids_seq_length,
                 )
             )
         if generation_config.suppress_tokens is not None:
```
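A standalone sketch of the corrected constructor call, assuming the top-level import (the class also lives in `transformers.generation.logits_process`) and illustrative argument values:

```python
from transformers import ExponentialDecayLengthPenalty

# Inside generate(), this is a local variable holding the prompt length in
# tokens; it was never an attribute of GenerationConfig.
input_ids_seq_length = 4

processor = ExponentialDecayLengthPenalty(
    (10, 1.05),            # generation_config.exponential_decay_length_penalty: (start_index, decay_factor)
    50256,                 # generation_config.eos_token_id (GPT-2's EOS id, for example)
    input_ids_seq_length,  # the local value, which is the one-line fix above
)
```

The penalty begins boosting the EOS score `start_index` tokens after the prompt ends, which is why the processor needs the prompt length at construction time.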