Mirror of https://github.com/huggingface/transformers.git, synced 2025-08-02 19:21:31 +06:00
Fix error in doc of DataCollatorWithPadding (#16662)
The default value of `padding` in `DataCollatorWithPadding` is `True`, not `False`.
This commit is contained in:
parent
9db2eebbe2
commit
5db2fcc61d
@@ -220,11 +220,11 @@ class DataCollatorWithPadding:
             Select a strategy to pad the returned sequences (according to the model's padding side and padding index)
             among:
 
-            - `True` or `'longest'`: Pad to the longest sequence in the batch (or no padding if only a single sequence
+            - `True` or `'longest'` (default): Pad to the longest sequence in the batch (or no padding if only a single sequence
               is provided).
             - `'max_length'`: Pad to a maximum length specified with the argument `max_length` or to the maximum
               acceptable input length for the model if that argument is not provided.
-            - `False` or `'do_not_pad'` (default): No padding (i.e., can output a batch with sequences of different
+            - `False` or `'do_not_pad'`: No padding (i.e., can output a batch with sequences of different
               lengths).
         max_length (`int`, *optional*):
             Maximum length of the returned list and optionally padding length (see above).
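The corrected docstring matches the collator's actual behavior: with no `padding` argument, `DataCollatorWithPadding` pads each batch to its longest sequence rather than leaving sequences unpadded. A minimal sketch of that default, assuming PyTorch is installed (the collator returns PyTorch tensors by default) and using "bert-base-uncased" purely as an example checkpoint:

    # Sketch: default padding strategy of DataCollatorWithPadding (padding=True / 'longest')
    from transformers import AutoTokenizer, DataCollatorWithPadding

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    collator = DataCollatorWithPadding(tokenizer)  # padding not passed -> defaults to True

    features = [
        tokenizer("a short sentence"),
        tokenizer("a noticeably longer sentence that needs more tokens"),
    ]
    batch = collator(features)

    # Both rows are padded to the length of the longest sequence in the batch.
    print(batch["input_ids"].shape)
    print(batch["attention_mask"].shape)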