transformers/docs/source/main_classes
Latest commit 7682e97702 by Stas Bekman, 2021-06-28 20:11:21 -07:00:
[models] respect dtype of the model when instantiating it (#12316)

* [models] respect dtype of the model when instantiating it
* cleanup
* cleanup
* rework to handle non-float dtype
* fix
* switch to fp32 tiny model
* improve
* use dtype.is_floating_point
* Apply suggestions from code review
* fix the doc
* recode to use explicit torch_dtype_auto_detect, torch_dtype args
* docs and tweaks
* docs and tweaks
* docs and tweaks
* merge 2 args, add docs
* fix
* fix
* better doc
* better doc

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
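The commit above changed model instantiation so that the checkpoint's dtype is respected, using `dtype.is_floating_point` to skip non-float parameters (in Transformers this surfaces as the `torch_dtype` argument to `from_pretrained`, which accepts either an explicit `torch.dtype` or `"auto"`). A minimal sketch of the auto-detection idea; the helper name `detect_model_dtype` is hypothetical, not the library's own function:

```python
import torch
from torch import nn


def detect_model_dtype(model: nn.Module) -> torch.dtype:
    """Hypothetical sketch: return the dtype of the first
    floating-point parameter, mirroring the commit's use of
    dtype.is_floating_point to ignore non-float (e.g. int)
    buffers/params; fall back to float32 if none are found."""
    for param in model.parameters():
        if param.dtype.is_floating_point:
            return param.dtype
    return torch.float32


model = nn.Linear(4, 4).half()  # cast weights to fp16
print(detect_model_dtype(model))  # torch.float16
```

In the library itself the equivalent behavior is requested with `AutoModel.from_pretrained(checkpoint, torch_dtype="auto")`, or forced with `torch_dtype=torch.float16`.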
File                       Last commit                                                                Date
callback.rst               Add example for registering callbacks with trainers (#10928)               2021-04-05 12:27:23 -04:00
configuration.rst          Copyright (#8970)                                                          2020-12-07 18:36:34 -05:00
data_collator.rst          Doc check: a bit of clean up (#11224)                                      2021-04-13 12:14:25 -04:00
deepspeed.rst              [models] respect dtype of the model when instantiating it (#12316)         2021-06-28 20:11:21 -07:00
feature_extractor.rst      Add ImageFeatureExtractionMixin (#10905)                                   2021-03-26 11:23:56 -04:00
logging.rst                Logging propagation (#10092)                                               2021-02-09 10:27:49 -05:00
model.rst                  [models] respect dtype of the model when instantiating it (#12316)         2021-06-28 20:11:21 -07:00
optimizer_schedules.rst    Seq2seq trainer (#9241)                                                    2020-12-22 11:33:44 -05:00
output.rst                 update QuickTour docs to reflect model output object (#11462)              2021-04-26 22:18:37 -04:00
pipelines.rst              Fix doc deployment                                                         2021-05-13 10:34:14 -04:00
processors.rst             Examples reorg (#11350)                                                    2021-04-21 11:11:20 -04:00
tokenizer.rst              Documentation about loading a fast tokenizer within Transformers (#11029)  2021-04-05 10:51:16 -04:00
trainer.rst                remove extra white space from log format (#12360)                          2021-06-25 13:20:14 -07:00