

# Longformer

Longformer is a transformer model designed for processing long documents. The self-attention operation usually scales quadratically with sequence length, preventing transformers from processing longer sequences. The Longformer attention mechanism overcomes this by scaling linearly with sequence length. It combines local windowed attention with task-specific global attention, enabling efficient processing of documents with thousands of tokens.
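
As a rough sketch of where these two pieces live in the API (the values below are illustrative, not the checkpoint's settings): the size of the local window is a hyperparameter of [`LongformerConfig`], while global attention is assigned per token at runtime through `global_attention_mask` (see the Notes section below).

```py
# Minimal sketch (illustrative values): the local window size is a config
# hyperparameter; global attention is set per token at inference time.
from transformers import LongformerConfig, LongformerModel

config = LongformerConfig(
    attention_window=256,          # each token attends to a 256-token local window
    max_position_embeddings=4098,  # room for sequences of up to 4096 tokens
)
model = LongformerModel(config)    # randomly initialized, for illustration only
```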

You can find all the original Longformer checkpoints under the Ai2 organization.

> [!TIP]
> Click on the Longformer models in the right sidebar for more examples of how to apply Longformer to different language tasks.

The example below demonstrates how to fill the `<mask>` token with [`Pipeline`], [`AutoModel`], and from the command line.

```py
import torch
from transformers import pipeline

pipeline = pipeline(
    task="fill-mask",
    model="allenai/longformer-base-4096",
    torch_dtype=torch.float16,
    device=0
)
pipeline("""San Francisco 49ers cornerback Shawntae Spencer will miss the rest of the <mask> with a torn ligament in his left knee.
Spencer, a fifth-year pro, will be placed on injured reserve soon after undergoing surgery Wednesday to repair the ligament. He injured his knee late in the 49ers road victory at Seattle on Sept. 14, and missed last week's victory over Detroit.
Tarell Brown and Donald Strickland will compete to replace Spencer with the 49ers, who kept 12 defensive backs on their 53-man roster to start the season. Brown, a second-year pro, got his first career interception last weekend while filling in for Strickland, who also sat out with a knee injury.""")
```

```py
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
model = AutoModelForMaskedLM.from_pretrained("allenai/longformer-base-4096")

text = (
"""
San Francisco 49ers cornerback Shawntae Spencer will miss the rest of the <mask> with a torn ligament in his left knee.
Spencer, a fifth-year pro, will be placed on injured reserve soon after undergoing surgery Wednesday to repair the ligament. He injured his knee late in the 49ers road victory at Seattle on Sept. 14, and missed last week's victory over Detroit.
Tarell Brown and Donald Strickland will compete to replace Spencer with the 49ers, who kept 12 defensive backs on their 53-man roster to start the season. Brown, a second-year pro, got his first career interception last weekend while filling in for Strickland, who also sat out with a knee injury.
"""
)

input_ids = tokenizer([text], return_tensors="pt")["input_ids"]
logits = model(input_ids).logits

masked_index = (input_ids[0] == tokenizer.mask_token_id).nonzero().item()  # position of the <mask> token
probs = logits[0, masked_index].softmax(dim=0)
values, predictions = probs.topk(5)  # top-5 candidate fills
tokenizer.decode(predictions).split()
```

```bash
echo -e "San Francisco 49ers cornerback Shawntae Spencer will miss the rest of the <mask> with a torn ligament in his left knee." | transformers run --task fill-mask --model allenai/longformer-base-4096 --device 0
```

## Notes

- Longformer is based on RoBERTa and doesn't have `token_type_ids`. You don't need to indicate which token belongs to which segment; just separate the segments with the separator token `</s>` or `tokenizer.sep_token`.

- You can set which tokens attend locally and which tokens attend globally with the `global_attention_mask` at inference (see this example for more details). A value of `0` means a token attends locally and a value of `1` means a token attends globally; a short sketch of this follows the list.

- [`LongformerForMaskedLM`] is trained like [`RobertaForMaskedLM`] and should be used as shown below.

  ```py
  input_ids = tokenizer.encode("This is a sentence from <mask> training data", return_tensors="pt")
  mlm_labels = tokenizer.encode("This is a sentence from the training data", return_tensors="pt")
  loss = model(input_ids, labels=mlm_labels).loss
  ```
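
The snippet below is a minimal, self-contained sketch of the `global_attention_mask` note above; the input text and the choice of giving only the first token global attention are illustrative.

```py
# Minimal sketch: give the first (<s>) token global attention and keep every
# other token on local windowed attention. The input text is a placeholder.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
model = AutoModel.from_pretrained("allenai/longformer-base-4096")

inputs = tokenizer("A very long document goes here ...", return_tensors="pt")

global_attention_mask = torch.zeros_like(inputs["input_ids"])  # 0 = local attention
global_attention_mask[:, 0] = 1                                # 1 = global attention on <s>

outputs = model(**inputs, global_attention_mask=global_attention_mask)
```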

## LongformerConfig

[[autodoc]] LongformerConfig

## LongformerTokenizer

[[autodoc]] LongformerTokenizer

## LongformerTokenizerFast

[[autodoc]] LongformerTokenizerFast

## Longformer specific outputs

[[autodoc]] models.longformer.modeling_longformer.LongformerBaseModelOutput

[[autodoc]] models.longformer.modeling_longformer.LongformerBaseModelOutputWithPooling

[[autodoc]] models.longformer.modeling_longformer.LongformerMaskedLMOutput

[[autodoc]] models.longformer.modeling_longformer.LongformerQuestionAnsweringModelOutput

[[autodoc]] models.longformer.modeling_longformer.LongformerSequenceClassifierOutput

[[autodoc]] models.longformer.modeling_longformer.LongformerMultipleChoiceModelOutput

[[autodoc]] models.longformer.modeling_longformer.LongformerTokenClassifierOutput

[[autodoc]] models.longformer.modeling_tf_longformer.TFLongformerBaseModelOutput

[[autodoc]] models.longformer.modeling_tf_longformer.TFLongformerBaseModelOutputWithPooling

[[autodoc]] models.longformer.modeling_tf_longformer.TFLongformerMaskedLMOutput

[[autodoc]] models.longformer.modeling_tf_longformer.TFLongformerQuestionAnsweringModelOutput

[[autodoc]] models.longformer.modeling_tf_longformer.TFLongformerSequenceClassifierOutput

[[autodoc]] models.longformer.modeling_tf_longformer.TFLongformerMultipleChoiceModelOutput

[[autodoc]] models.longformer.modeling_tf_longformer.TFLongformerTokenClassifierOutput

## LongformerModel

[[autodoc]] LongformerModel
    - forward

## LongformerForMaskedLM

[[autodoc]] LongformerForMaskedLM
    - forward

## LongformerForSequenceClassification

[[autodoc]] LongformerForSequenceClassification
    - forward

## LongformerForMultipleChoice

[[autodoc]] LongformerForMultipleChoice
    - forward

## LongformerForTokenClassification

[[autodoc]] LongformerForTokenClassification
    - forward

## LongformerForQuestionAnswering

[[autodoc]] LongformerForQuestionAnswering
    - forward

## TFLongformerModel

[[autodoc]] TFLongformerModel
    - call

## TFLongformerForMaskedLM

[[autodoc]] TFLongformerForMaskedLM
    - call

## TFLongformerForQuestionAnswering

[[autodoc]] TFLongformerForQuestionAnswering
    - call

## TFLongformerForSequenceClassification

[[autodoc]] TFLongformerForSequenceClassification
    - call

## TFLongformerForTokenClassification

[[autodoc]] TFLongformerForTokenClassification
    - call

## TFLongformerForMultipleChoice

[[autodoc]] TFLongformerForMultipleChoice
    - call