transformers/tests
Marvin Gabler 0a3b9d02fe
#26566 swin2 sr allow in out channels (#26568)
* feat: close #26566, changed model & config files to accept arbitrary in and out channels

* updated docstrings

* fix: linter error

* fix: update Copy docstrings

* fix: linter update

* fix: rename num_channels_in to num_channels to prevent breaking changes

* fix: make num_channels_out None by default

* Update src/transformers/models/swin2sr/configuration_swin2sr.py

Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>

* fix: update tests to include num_channels_out

* fix: linter

* fix: remove normalization with precomputed RGB values when the number of input channels differs from the number of output channels

---------

Co-authored-by: marvingabler <marvingabler@outlook.de>
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
2023-10-05 15:20:38 +02:00
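The change described in the commit above can be sketched as follows. This is a minimal, hypothetical illustration of the behavior the PR describes (the `Swin2SRConfigSketch` class and its method names are invented for this sketch, not the actual `transformers` API): `num_channels_out` defaults to `None` and falls back to `num_channels`, and normalization with precomputed RGB values is skipped when the input and output channel counts differ.

```python
class Swin2SRConfigSketch:
    """Hypothetical sketch of the config change in PR #26568 (not the real API)."""

    def __init__(self, num_channels=3, num_channels_out=None):
        # `num_channels_in` was renamed back to `num_channels` to avoid a
        # breaking change; `num_channels_out` defaults to None and falls
        # back to `num_channels` when unset.
        self.num_channels = num_channels
        self.num_channels_out = (
            num_channels if num_channels_out is None else num_channels_out
        )

    def uses_rgb_mean_normalization(self):
        # Normalization with precomputed RGB values only applies when the
        # input and output channel counts match (3-channel RGB in and out).
        return self.num_channels == 3 and self.num_channels_out == 3
```

For example, `Swin2SRConfigSketch()` keeps the old 3-in/3-out RGB behavior, while `Swin2SRConfigSketch(num_channels=4, num_channels_out=1)` disables the RGB-mean normalization path.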
benchmark
bettertransformer Add methods to PreTrainedModel to use PyTorch's BetterTransformer (#21259) 2023-04-27 11:03:42 +02:00
deepspeed fix the deepspeed tests (#26021) 2023-09-13 10:26:53 +05:30
extended [tests] switch to torchrun (#22712) 2023-04-12 08:25:45 -07:00
fixtures [WIP] add SpeechT5 model (#18922) 2023-02-03 12:43:46 -05:00
fsdp fix name error when accelerate is not available (#26278) 2023-09-20 08:02:55 +02:00
generation Fix beam search when using model parallel (#24969) 2023-09-14 11:00:52 -04:00
models #26566 swin2 sr allow in out channels (#26568) 2023-10-05 15:20:38 +02:00
optimization Make schedulers picklable by making lr_lambda fns global (#21768) 2023-03-02 12:08:43 -05:00
peft_integration [PEFT] Final fixes (#26559) 2023-10-03 14:53:09 +02:00
pipelines Add tokenizer kwargs to fill mask pipeline. (#26234) 2023-10-03 10:25:10 +02:00
quantization [PEFT] Final fixes (#26559) 2023-10-03 14:53:09 +02:00
repo_utils Docstring check (#26052) 2023-10-04 15:13:37 +02:00
sagemaker Avoid invalid escape sequences, use raw strings (#22936) 2023-04-25 09:17:56 -04:00
tokenization 🚨🚨 🚨🚨 [Tokenizer] attempt to fix add_token issues🚨🚨 🚨🚨 (#23909) 2023-09-18 20:28:36 +02:00
tools Add support for for loops in python interpreter (#24429) 2023-06-26 09:58:14 -04:00
trainer enable optuna multi-objectives feature (#25969) 2023-09-12 18:01:22 +01:00
utils Update tiny model information and pipeline tests (#26285) 2023-09-25 18:08:12 +02:00
__init__.py
test_backbone_common.py [AutoBackbone] Add test (#26094) 2023-09-18 23:47:54 +02:00
test_configuration_common.py Deal with nested configs better in base class (#25237) 2023-08-04 14:56:09 +02:00
test_configuration_utils.py Deal with nested configs better in base class (#25237) 2023-08-04 14:56:09 +02:00
test_feature_extraction_common.py Split common test from core tests (#24284) 2023-06-15 07:30:24 -04:00
test_feature_extraction_utils.py Split common test from core tests (#24284) 2023-06-15 07:30:24 -04:00
test_image_processing_common.py Input data format (#25464) 2023-08-16 17:45:02 +01:00
test_image_processing_utils.py Run hub tests (#24807) 2023-07-13 15:25:45 -04:00
test_image_transforms.py Add input_data_format argument, image transforms (#25462) 2023-08-11 15:09:31 +01:00
test_modeling_common.py [core] fix silent bug keep_in_fp32 modules (#26589) 2023-10-05 14:44:31 +02:00
test_modeling_flax_common.py Split common test from core tests (#24284) 2023-06-15 07:30:24 -04:00
test_modeling_flax_utils.py Split common test from core tests (#24284) 2023-06-15 07:30:24 -04:00
test_modeling_tf_common.py Skip test_onnx_runtime_optimize for now (#25560) 2023-08-17 11:23:16 +02:00
test_modeling_tf_utils.py Split common test from core tests (#24284) 2023-06-15 07:30:24 -04:00
test_modeling_utils.py skip flaky hub tests (#26594) 2023-10-04 17:47:55 +02:00
test_pipeline_mixin.py Add image to image pipeline (#25393) 2023-09-22 19:53:55 +03:00
test_sequence_feature_extraction_common.py Fix typo (#25966) 2023-09-05 10:12:25 +02:00
test_tokenization_common.py 🚨🚨 🚨🚨 [Tokenizer] attempt to fix add_token issues🚨🚨 🚨🚨 (#23909) 2023-09-18 20:28:36 +02:00
test_tokenization_utils.py Split common test from core tests (#24284) 2023-06-15 07:30:24 -04:00