Handle padding warning in generation when using inputs_embeds (#23131)

* Handle padding warning in generation when using `inputs_embeds`

* Simpler condition

* Black formatter

* Changed warning logic
Alisamar Husain 2023-05-12 21:36:15 +05:30 committed by GitHub
parent 65d7b21b77
commit 291c5e9b25

@@ -1307,8 +1307,11 @@ class GenerationMixin:
    # decoder-only models should use left-padding for generation
    if not self.config.is_encoder_decoder:
        # If `input_ids` was given, check if the last id in any sequence is `pad_token_id`
        # Note: If using `inputs_embeds`, this check does not work, because we want to be more hands-off.
        if (
            generation_config.pad_token_id is not None
            and len(inputs_tensor.shape) == 2
            and torch.sum(inputs_tensor[:, -1] == generation_config.pad_token_id) > 0
        ):
            logger.warning(
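
The key change above is the `len(inputs_tensor.shape) == 2` clause: `input_ids` is a 2-D tensor of token ids, while `inputs_embeds` is 3-D (batch, sequence, hidden), so the shape test short-circuits the pad-token comparison and suppresses the warning for embeddings. A minimal standalone sketch of that condition (the `pad_token_id` value and tensors here are illustrative, not taken from the library's internals):

```python
import torch

pad_token_id = 0  # illustrative; the real code reads generation_config.pad_token_id

# 2-D `input_ids`: the check applies and detects right-padding.
input_ids = torch.tensor([[5, 6, pad_token_id], [7, 8, 9]])  # first sequence ends in pad
warn_on_ids = bool(
    pad_token_id is not None
    and len(input_ids.shape) == 2
    and torch.sum(input_ids[:, -1] == pad_token_id) > 0
)

# 3-D `inputs_embeds`: the shape test fails first, so the pad-token
# comparison is never evaluated and no warning is issued.
inputs_embeds = torch.randn(2, 3, 16)
warn_on_embeds = bool(
    pad_token_id is not None
    and len(inputs_embeds.shape) == 2
    and torch.sum(inputs_embeds[..., -1] == pad_token_id) > 0
)
```

Here `warn_on_ids` is `True` (a sequence ends with the pad token) and `warn_on_embeds` is `False`, matching the commit's intent of staying hands-off when the caller supplies embeddings directly.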