Rémi Ouazan
9ff246db00
Expectation fixes and added AMD expectations ( #38729 )
2025-06-13 16:14:58 +02:00
Lysandre Debut
54a123f068
Simplify soft dependencies and update the dummy-creation process ( #36827 )
...
* Reverse dependency map shouldn't be created when test_all is set
* [test_all] Remove dummies
* Modular fixes
* Update utils/check_repo.py
Co-authored-by: Pablo Montalvo <39954772+molbap@users.noreply.github.com>
* [test_all] Better docs
* [test_all] Update src/transformers/commands/chat.py
Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
* [test_all] Remove deprecated AdaptiveEmbeddings from the tests
* [test_all] Doc builder
* [test_all] is_dummy
* [test_all] Import utils
* [test_all] Doc building should not require all deps
---------
Co-authored-by: Pablo Montalvo <39954772+molbap@users.noreply.github.com>
Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
2025-04-11 11:08:36 +02:00
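A minimal sketch of the soft-dependency pattern this PR streamlines: optional backends are probed with the availability helpers in `transformers.utils` instead of relying on generated dummy objects. The helper function below is illustrative, not code from the PR.

```python
from transformers.utils import is_torch_available, is_vision_available

if is_torch_available():
    import torch  # imported only when the optional backend is actually installed

def describe_backends() -> str:
    """Report which optional backends are present, without importing missing ones."""
    backends = {
        "torch": is_torch_available(),
        "vision (Pillow)": is_vision_available(),
    }
    return ", ".join(f"{name}: {'yes' if ok else 'no'}" for name, ok in backends.items())

print(describe_backends())
```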
cyyever
1e6b546ea6
Use Python 3.9 syntax in tests ( #37343 )
...
Signed-off-by: cyy <cyyever@outlook.com>
2025-04-08 14:12:08 +02:00
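A hedged illustration of the kind of Python 3.9 syntax such a test cleanup typically adopts: built-in generics (PEP 585) in place of `typing.List`/`typing.Dict`, and the dict union operator (PEP 584). The function is a made-up example, not code from the PR.

```python
def merge_configs(base: dict[str, int], override: dict[str, int]) -> list[str]:
    """Merge two config dicts with the 3.9 `|` operator and return the sorted key list."""
    merged = base | override          # PEP 584 dict union, new in Python 3.9
    return sorted(merged)             # dict[str, int] / list[str] per PEP 585

assert merge_configs({"a": 1}, {"b": 2, "a": 3}) == ["a", "b"]
```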
Pavel Iakubovskii
48461c0fe2
Make pipeline able to load processor ( #32514 )
...
* Refactor get_test_pipeline
* Fixup
* Fixing tests
* Add processor loading in tests
* Restructure processors loading
* Add processor to the pipeline
* Move model loading to the top of the test
* Update `get_test_pipeline`
* Fixup
* Add class-based flags for loading processors
* Change `is_pipeline_test_to_skip` signature
* Skip t5 failing test for slow tokenizer
* Fixup
* Fix copies for T5
* Fix typo
* Add try/except for tokenizer loading (kosmos-2 case)
* Fixup
* Llama does not fail for long generation
* Revert processor pass in text-generation test
* Fix docs
* Switch back to json file for image processors and feature extractors
* Add processor type check
* Remove except for tokenizers
* Fix docstring
* Fix empty lists for tests
* Fixup
* Fix load check
* Ensure we have non-empty test cases
* Update src/transformers/pipelines/__init__.py
Co-authored-by: Lysandre Debut <hi@lysand.re>
* Update src/transformers/pipelines/base.py
Co-authored-by: Lysandre Debut <hi@lysand.re>
* Rework comment
* Better docs, add note about pipeline components
* Change warning to error raise
* Fixup
* Refine pipeline docs
---------
Co-authored-by: Lysandre Debut <hi@lysand.re>
2024-10-09 16:46:11 +01:00
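A minimal sketch of what this PR enables: pipelines whose models need a `Processor` (tokenizer plus image processor combined) can load one automatically or accept one explicitly. It assumes the `processor=` argument described above; the checkpoint and task pairing are illustrative only.

```python
from transformers import AutoProcessor, pipeline

model_id = "microsoft/kosmos-2-patch14-224"  # example multimodal checkpoint
processor = AutoProcessor.from_pretrained(model_id)

# Either let the pipeline resolve the processor itself from the checkpoint...
pipe = pipeline("image-to-text", model=model_id)

# ...or pass an explicitly loaded processor.
pipe = pipeline("image-to-text", model=model_id, processor=processor)

print(pipe("http://images.cocodataset.org/val2017/000000039769.jpg"))
```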
Billy Cao
ac26260436
Allow FP16 or other precision inference for Pipelines ( #31342 )
...
* cast image features to model.dtype where needed to support FP16 or other precision in pipelines
* Update src/transformers/pipelines/image_feature_extraction.py
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* Use .to instead
* Add FP16 pipeline support for zeroshot audio classification
* Remove unused torch imports
* Add docs on FP16 pipeline
* Remove unused import
* Add FP16 tests to pipeline mixin
* Add fp16 placeholder for mask_generation pipeline test
* Add FP16 tests for all pipelines
* Fix formatting
* Remove torch_dtype arg from is_pipeline_test_to_skip*
* Fix format
* trigger ci
---------
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
2024-07-05 17:21:50 +01:00
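A minimal sketch of the feature this PR adds: running a pipeline in half precision by passing `torch_dtype`, so inputs such as image features are cast to the model's dtype. The checkpoint name is illustrative.

```python
import torch
from transformers import pipeline

pipe = pipeline(
    "image-feature-extraction",
    model="google/vit-base-patch16-224",
    torch_dtype=torch.float16,   # weights and inputs kept in FP16 where supported
    device=0,                    # FP16 generally requires a GPU
)

features = pipe("http://images.cocodataset.org/val2017/000000039769.jpg")
print(len(features[0]))  # number of feature vectors extracted for the image
```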
amyeroberts
1de7dc7403
Skip tests properly ( #31308 )
...
* Skip tests properly
* [test_all]
* Add 'reason' as kwarg for skipTest
* [test_all] Fix up
* [test_all]
2024-06-26 21:59:08 +01:00
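A hedged illustration of the pattern this PR standardizes: skipping a test with an explicit `reason` instead of silently returning early, so the skip is visible in the test report. The test class below is illustrative only.

```python
import unittest

class ExampleTest(unittest.TestCase):
    @unittest.skip(reason="decorator-level skip with an explicit reason")
    def test_not_ready(self):
        ...

    def test_conditional(self):
        if not hasattr(unittest, "TestCase"):  # placeholder condition for this sketch
            self.skipTest(reason="runtime skip with an explicit reason")
        self.assertTrue(True)

if __name__ == "__main__":
    unittest.main()
```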
Lucain
fd65aa9818
Set usedforsecurity=False in hashlib methods (FIPS compliance) ( #27483 )
...
* Set usedforsecurity=False in hashlib methods (FIPS compliance)
* trigger ci
* tokenizers version
* deps
* bump hfh version
* let's try this
2023-11-16 14:29:53 +00:00
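A minimal sketch of the FIPS-friendly pattern this PR applies: hashes used only for cache keys or fingerprints are flagged as non-security uses via the `usedforsecurity` keyword available in Python 3.9+. The helper and payload are illustrative.

```python
import hashlib

def cache_fingerprint(payload: bytes) -> str:
    """Return a hex digest for cache lookup; the hash is not used for security."""
    return hashlib.sha256(payload, usedforsecurity=False).hexdigest()

print(cache_fingerprint(b"https://huggingface.co/some/model/resolve/main/config.json"))
```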
Yih-Dar
c23d131eab
Update tiny models for pipeline testing. ( #24364 )
...
* fix
---------
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
2023-06-20 14:43:10 +02:00
Yih-Dar
975159bb61
Update tiny models and a few fixes ( #22928 )
...
* run_check_tiny_models
* update summary
* update mixin
* update pipeline_model_mapping
* update pipeline_model_mapping
* Update for gpt_bigcode
---------
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
2023-04-24 14:45:22 +02:00
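A hedged sketch of the `pipeline_model_mapping` idea these tiny-model updates maintain: a model's test class declares which pipeline tasks it should be exercised with, using tiny checkpoints. The class below is illustrative, not the real BERT test class.

```python
from transformers import BertForSequenceClassification, BertModel

class BertModelTestSketch:
    # Maps pipeline task names to the model classes the pipeline tests should load.
    pipeline_model_mapping = {
        "feature-extraction": BertModel,
        "text-classification": BertForSequenceClassification,
    }

for task, cls in BertModelTestSketch.pipeline_model_mapping.items():
    print(f"{task}: {cls.__name__}")
```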
Arthur
f143037789
Add automatic-mask-generation pipeline for Segment Anything Model (SAM) ( #22840 )
...
* cleanup
* updates
* more refactoring
* make style
* update inits
* support other inputs in base
* update based on review
Co-authored-by: Nicolas Patry <patry.nicolas@gmail.com>
* Update tests/pipelines/test_pipelines_automatic_mask_generation.py
Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
* update
* fixup
* TODO x and y to refactor, _h _w refactored here
* update docstring
* more nits
* style on these
* more doc fix
* rename variables
* update
* updates
* style
* update
* fix `_mask_to_rle_pytorch`
* styling
* fix mask to rle, wrong outputs
* add device arg
* update
* more updates, fix tests
* update
* update docstrings
* styling
* fixup
* add notebook on the docs
* update original sizes
* fix docstring
* update condition on points_per_batch
* updates tests
* fix CI test
* extend is required, append does not work!
* fixup
* fix CI tests
* white pixels left
* address doc comments
* fix doc
* slow pipeline tests
* update auto init
* add revision
* make fixup
* update pipeline tag when calling tests
* alphabetical order in inits
* fix copies
* last style nits
* Apply suggestions from code review
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* reformat docstring
* more reformat
* address most of the comments
* Update src/transformers/pipelines/mask_generation.py
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* final refactor
* Update src/transformers/models/sam/image_processing_sam.py
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* fixup and fix slow tests
* revert
---------
Co-authored-by: Nicolas Patry <patry.nicolas@gmail.com>
Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
Co-authored-by: younesbelkada <younesbelkada@gmail.com>
Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
2023-04-20 19:27:24 +02:00
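A minimal sketch of using the automatic mask generation pipeline this PR introduces. It assumes a SAM checkpoint such as facebook/sam-vit-base; the device and batch size are illustrative.

```python
from transformers import pipeline

generator = pipeline("mask-generation", model="facebook/sam-vit-base", device=0)

outputs = generator(
    "http://images.cocodataset.org/val2017/000000039769.jpg",
    points_per_batch=64,  # how many prompt points are scored per forward pass
)

print(len(outputs["masks"]), "masks; first score:", outputs["scores"][0])
```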