transformers/docs/source/main_classes
Latest commit: Documentation about loading a fast tokenizer within Transformers (#11029)
Author: Lysandre Debut (9f4e0c23d6)
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
Committed: 2021-04-05 10:51:16 -04:00
Commit description: Documentation about loading a fast tokenizer within Transformers; Apply suggestions from code review; style
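For reference, the tokenizer documentation touched by that commit covers loading a fast tokenizer built with the standalone tokenizers library into Transformers. The sketch below is an assumed illustration of that workflow, not the PR's exact example; the tokenizer.json path is hypothetical.

    # Minimal sketch (assumed, not the PR's exact example): wrap a tokenizer
    # serialized by the standalone `tokenizers` library as a Transformers
    # fast tokenizer.
    from tokenizers import Tokenizer
    from transformers import PreTrainedTokenizerFast

    # "tokenizer.json" is a hypothetical path to a tokenizer saved by `tokenizers`.
    tok = Tokenizer.from_file("tokenizer.json")

    # Pass the in-memory tokenizer object, or point at the file directly instead.
    fast_tokenizer = PreTrainedTokenizerFast(tokenizer_object=tok)
    # fast_tokenizer = PreTrainedTokenizerFast(tokenizer_file="tokenizer.json")

    print(fast_tokenizer("Loading a fast tokenizer within Transformers."))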
File | Last commit | Last updated
callback.rst | Copyright (#8970) | 2020-12-07 18:36:34 -05:00
configuration.rst | Copyright (#8970) | 2020-12-07 18:36:34 -05:00
feature_extractor.rst | Add ImageFeatureExtractionMixin (#10905) | 2021-03-26 11:23:56 -04:00
logging.rst | Logging propagation (#10092) | 2021-02-09 10:27:49 -05:00
model.rst | [Flax] Align FlaxBertForMaskedLM with BertForMaskedLM, implement from_pretrained, init (#9054) | 2020-12-16 13:03:32 +01:00
optimizer_schedules.rst | Seq2seq trainer (#9241) | 2020-12-22 11:33:44 -05:00
output.rst | Remove unsupported methods from ModelOutput doc (#10505) | 2021-03-03 14:55:18 -05:00
pipelines.rst | TableQuestionAnsweringPipeline (#9145) | 2020-12-16 12:31:50 -05:00
processors.rst | Fix documentation links always pointing to master. (#9217) | 2021-01-05 06:18:48 -05:00
tokenizer.rst | Documentation about loading a fast tokenizer within Transformers (#11029) | 2021-04-05 10:51:16 -04:00
trainer.rst | [Deepspeed] Allow HF optimizer and scheduler to be passed to deepspeed (#10464) | 2021-03-16 15:51:09 -07:00