Mirror of https://github.com/huggingface/transformers.git
Synced 2025-07-31 02:02:21 +06:00
Parent: 0224aaf67f
Commit: 06b05d4575
@@ -68,7 +68,8 @@ class AggregationStrategy(ExplicitEnum):
             same entity together in the predictions or not.
         stride (`int`, *optional*):
             If stride is provided, the pipeline is applied on all the text. The text is split into chunks of size
-            model_max_length. Works only with fast tokenizers and `aggregation_strategy` different from `NONE`.
+            model_max_length. Works only with fast tokenizers and `aggregation_strategy` different from `NONE`. The
+            value of this argument defines the number of overlapping tokens between chunks.
         aggregation_strategy (`str`, *optional*, defaults to `"none"`):
             The strategy to fuse (or not) tokens based on the model prediction.
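The new docstring text says the stride value is the number of overlapping tokens between consecutive chunks. A minimal standalone sketch of that chunking semantics (not the pipeline's actual internals, which rely on the fast tokenizer's overflow handling):

```python
def chunk_with_stride(tokens, max_len, stride):
    """Split `tokens` into chunks of at most `max_len`, where consecutive
    chunks overlap by `stride` tokens -- a simplified illustration of the
    overlap behavior described in the docstring change above."""
    if stride >= max_len:
        raise ValueError("stride must be smaller than the chunk size")
    chunks = []
    step = max_len - stride  # each new chunk advances by this many tokens
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # the last chunk already reaches the end of the text
    return chunks


# With max_len=4 and stride=2, adjacent chunks share exactly 2 tokens:
print(chunk_with_stride(list(range(10)), max_len=4, stride=2))
```

In the real pipeline this splitting only takes effect with a fast tokenizer and an `aggregation_strategy` other than `NONE`, since the overlapping predictions must be merged back into a single entity list.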