transformers/tests
Zach Mueller 6757ed28ce
Allow resume_from_checkpoint to handle auto_find_batch_size (#27568)
* Fulfill request

* Add test

* Better test

* Apply suggestions from code review

Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>

* Better test

* Better test

* More comments

2023-12-08 11:51:02 -05:00
benchmark [Test refactor 1/5] Per-folder tests reorganization (#15725) 2022-02-23 15:46:28 -05:00
bettertransformer Fixed malapropism error (#26660) 2023-10-09 11:04:57 +02:00
deepspeed device-agnostic deepspeed testing (#27342) 2023-11-09 12:34:13 +01:00
extended Device agnostic trainer testing (#27131) 2023-10-30 18:16:40 +00:00
fixtures [WIP] add SpeechT5 model (#18922) 2023-02-03 12:43:46 -05:00
fsdp device agnostic fsdp testing (#27120) 2023-11-01 07:17:06 +01:00
generation Fix remaining issues in beam score calculation (#27808) 2023-12-08 14:14:16 +01:00
models mark test_initialization as flaky in 2 model tests (#27906) 2023-12-08 14:54:32 +01:00
optimization Make schedulers picklable by making lr_lambda fns global (#21768) 2023-03-02 12:08:43 -05:00
peft_integration [Peft] modules_to_save support for peft integration (#27466) 2023-11-14 10:32:57 +01:00
pipelines Fix 2 tests in FillMaskPipelineTests (#27889) 2023-12-08 14:55:29 +01:00
quantization Faster generation using AWQ + Fused modules (#27411) 2023-12-05 12:14:45 +01:00
repo_utils Allow # Ignore copy (#27328) 2023-12-07 10:00:08 +01:00
sagemaker Broken links fixed related to datasets docs (#27569) 2023-11-17 13:44:09 -08:00
tokenization [Styling] stylify using ruff (#27144) 2023-11-16 17:43:19 +01:00
tools Add support for for loops in python interpreter (#24429) 2023-06-26 09:58:14 -04:00
trainer Allow resume_from_checkpoint to handle auto_find_batch_size (#27568) 2023-12-08 11:51:02 -05:00
utils Update tiny model summary file (#27388) 2023-11-23 21:00:39 +01:00
__init__.py GPU text generation: Moved the encoded_prompt to the correct device 2020-01-06 15:11:12 +01:00
test_backbone_common.py [AutoBackbone] Add test (#26094) 2023-09-18 23:47:54 +02:00
test_cache_utils.py Generate: New Cache abstraction and Attention Sinks support (#26681) 2023-12-08 09:00:17 +01:00
test_configuration_common.py [PretrainedConfig] Improve messaging (#27438) 2023-11-15 14:10:39 +01:00
test_configuration_utils.py Remove-auth-token (#27060) 2023-11-13 14:20:54 +01:00
test_feature_extraction_common.py Split common test from core tests (#24284) 2023-06-15 07:30:24 -04:00
test_feature_extraction_utils.py Remove-auth-token (#27060) 2023-11-13 14:20:54 +01:00
test_image_processing_common.py Input data format (#25464) 2023-08-16 17:45:02 +01:00
test_image_processing_utils.py Remove-auth-token (#27060) 2023-11-13 14:20:54 +01:00
test_image_transforms.py Normalize floating point cast (#27249) 2023-11-10 15:35:27 +00:00
test_modeling_common.py Generate: New Cache abstraction and Attention Sinks support (#26681) 2023-12-08 09:00:17 +01:00
test_modeling_flax_common.py Split common test from core tests (#24284) 2023-06-15 07:30:24 -04:00
test_modeling_flax_utils.py Default to msgpack for safetensors (#27460) 2023-11-13 15:17:01 +01:00
test_modeling_tf_common.py Deprecate TransfoXL (#27607) 2023-11-24 11:48:02 +01:00
test_modeling_tf_utils.py Default to msgpack for safetensors (#27460) 2023-11-13 15:17:01 +01:00
test_modeling_utils.py [⚠️ removed a default argument] Make AttentionMaskConverter compatible with torch.compile(..., fullgraph=True) (#27868) 2023-12-08 18:44:47 +09:00
test_pipeline_mixin.py Shorten the conversation tests for speed + fixing position overflows (#26960) 2023-10-31 14:20:04 +00:00
test_sequence_feature_extraction_common.py Fix typo (#25966) 2023-09-05 10:12:25 +02:00
test_tokenization_common.py [Styling] stylify using ruff (#27144) 2023-11-16 17:43:19 +01:00
test_tokenization_utils.py Remove-auth-token (#27060) 2023-11-13 14:20:54 +01:00