Fix None in add_token_positions - issue #10210 (#10374)

* Fix None in add_token_positions - issue #10210

Fix None values in add_token_positions, related to issue #10210

* add_token_positions: fix None values in the end_positions vector

Fix None values in the end_positions vector in add_token_positions, as proposed by @joeddav.
Andrea Bacciu 2021-02-25 17:18:33 +01:00 committed by GitHub
parent 9d14be5c20
commit b040e6efc1


@@ -563,6 +563,7 @@ we can use the built in :func:`~transformers.BatchEncoding.char_to_token` method
        # if start position is None, the answer passage has been truncated
        if start_positions[-1] is None:
            start_positions[-1] = tokenizer.model_max_length
        if end_positions[-1] is None:
            end_positions[-1] = tokenizer.model_max_length
    encodings.update({'start_positions': start_positions, 'end_positions': end_positions})
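
For context, here is a minimal sketch of how the patched helper reads in the custom datasets QA tutorial after this change. It assumes the encodings were produced by a fast tokenizer (so char_to_token is available) and that tokenizer and the answers[i]['answer_start'] / answers[i]['answer_end'] character offsets are defined earlier in the tutorial; the surrounding lines are an assumption for illustration, not part of this diff.

def add_token_positions(encodings, answers):
    # 'tokenizer' is assumed to be the fast tokenizer created earlier in the tutorial
    start_positions = []
    end_positions = []
    for i in range(len(answers)):
        # map character offsets of the answer span to token indices
        start_positions.append(encodings.char_to_token(i, answers[i]['answer_start']))
        end_positions.append(encodings.char_to_token(i, answers[i]['answer_end'] - 1))

        # if start position is None, the answer passage has been truncated
        if start_positions[-1] is None:
            start_positions[-1] = tokenizer.model_max_length
        # char_to_token can also return None for the end offset (the case reported
        # in issue #10210); fall back to model_max_length there as well
        if end_positions[-1] is None:
            end_positions[-1] = tokenizer.model_max_length

    encodings.update({'start_positions': start_positions, 'end_positions': end_positions})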