Yih-Dar
bd90cda9a6
CI with num_hidden_layers=2 🚀 🚀 🚀 (#25266)
...
* CI with layers=2
---------
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
2023-08-02 20:22:36 +02:00
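To illustrate the idea behind this change, here is a minimal sketch of building a tiny model for CI, assuming a BERT-style config; the model class and sizes are illustrative, not the exact CI setup:

```python
# Sketch only: shrink a model to 2 hidden layers so CI tests build and run fast.
from transformers import BertConfig, BertModel

tiny_config = BertConfig(
    num_hidden_layers=2,   # the key change: 2 layers instead of the full depth
    hidden_size=32,        # illustrative tiny sizes, not the exact CI values
    num_attention_heads=4,
    intermediate_size=64,
)
model = BertModel(tiny_config)
print(model.config.num_hidden_layers)  # 2
```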
Yih-Dar
1b4f6199c6
Update tiny model info. and pipeline testing (#25213)
...
* update tiny_model_summary.json
* update
* update
* update
---------
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
2023-07-31 19:35:33 +02:00
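For context, a sketch of the kind of pipeline test this commit touches; the tiny checkpoint name below follows the hf-internal-testing naming convention and is an assumption, not taken from the PR:

```python
from transformers import pipeline

# Tiny random checkpoints keep pipeline tests fast; the repo name is illustrative.
pipe = pipeline(
    "text-classification",
    model="hf-internal-testing/tiny-random-BertForSequenceClassification",
)
print(pipe("pipeline smoke test"))
```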
Sebastian Husch Lee
8f36ab3e22
[T5, MT5, UMT5] Add [T5, MT5, UMT5]ForSequenceClassification (#24726)
...
* Initial addition of t5forsequenceclassification
* Adding imports and adding tests
* Formatting
* Running make fix-copies
* Adding mt5forseq
* Formatting
* run make fix-copies
* Adding to docs
* Add model_parallel
* Fix bug
* Fix
* Remove TODO
* Fixing tests for T5ForSequenceClassification
* Undo changes to dependency_versions_table.py
* Change classification head to work with T5Config directly
* Change seq length to let tests pass
* PR comments for formatting
* Formatting
* Initial addition of UMT5ForSequenceClassification
* Adding to inits and formatting
* run make fix-copies
* Add doc for UMT5ForSeqClass
* Update UMT5 config
* Fix docs
* Skip torch fx test for SequenceClassification
* Formatting
* Add skip to UMT5 tests as well
* Fix umt5 tests
* Running make fix-copies
* PR comments
* Fix for change to sentence_representation
* Rename seq_len to hidden_size since that's what it is
* Use base_model to follow format of the rest of the library
* Update docs
* Extract the decoder_input_ids changes and make one liner
* Make one-liner
2023-07-25 21:02:49 +02:00
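A minimal usage sketch of the new class; the checkpoint is illustrative, and since the classification head is freshly initialized, real use needs fine-tuning:

```python
import torch
from transformers import AutoTokenizer, T5ForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForSequenceClassification.from_pretrained("t5-small", num_labels=2)

inputs = tokenizer("Transformers is great!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels)
print(logits.shape)
```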
Arthur
b15343de6f
[Patch-t5-tokenizer] Patches the changes on T5 to make sure previous behaviour is still valid for the beginning of words (#24622)
...
* patch `_tokenize` function
* more tests
* properly fix
* fixup
* Update src/transformers/models/t5/tokenization_t5.py
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* fix without ifs
* update
* protect import
* add python processing
* is_first needed
* add doc and update with legacy
* update
* fix T5 SPM converter
* styling
* fix T5 warning
* add is_seqio_available
* remove is_first
* revert some changes
* more tests and update
* update llama test battery
* fixup
* refactor T5 spm common tests
* draft the llama tests
* update
* update test
* nits
* refine
* name nit
* fix t5 tests
* fix T5
* update
* revert convert slow to fast changes that fail lots of tests
* legacy support
* fixup
* nits: is_first not defined
* don't use legacy behaviour for switch transformers
* style
* My attempt to check.
* nits
* fixes
* update
* fixup
* Apply suggestions from code review
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* updates
* fixup
* add legacy warning
* fixup
* warning_once nit
* update t5 documentation test
* update llama tok documentation
* add space to warning
* nits
* nit
* Apply suggestions from code review
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* last nits
---------
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
2023-07-11 15:02:18 +02:00
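The user-facing knob this patch adds is the legacy flag on the slow T5 tokenizer; a small sketch of flipping it, with an illustrative input string:

```python
from transformers import T5Tokenizer

# legacy=True keeps the old behaviour; legacy=False opts into the corrected
# handling of tokens at the beginning of words introduced by this patch.
tok_legacy = T5Tokenizer.from_pretrained("t5-small", legacy=True)
tok_fixed = T5Tokenizer.from_pretrained("t5-small", legacy=False)

text = "Hello <extra_id_0>"
print(tok_legacy.tokenize(text))
print(tok_fixed.tokenize(text))
```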
Arthur
fb78769b9c
[MT5] Fix CONFIG_MAPPING issue leading it to load umt5 class (#24678)
...
* update
* add umt5 to auto tokenizer mapping
* nits
* fixup
* fix failing torch test
2023-07-07 11:33:54 +09:00
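A quick sketch of the behaviour this fix restores, assuming the google/mt5-small checkpoint:

```python
from transformers import AutoConfig

# Before the fix, the "mt5" entry in CONFIG_MAPPING could resolve to the UMT5
# class; after it, an MT5 checkpoint loads the MT5 config class again.
config = AutoConfig.from_pretrained("google/mt5-small")
print(type(config).__name__)  # expected: MT5Config
```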
Arthur
799df10aef
[Umt5] Add google's umt5 to transformers (#24477)
...
* add tokenization template
* update conversion script
* update modeling code
* update
* update convert checkpoint
* update modeling
* revert changes on convert script
* new conversion script for new format
* correct position bias
* cleaning a bit
* Credit co authors
Co-authored-by: agemagician <ahmed.elnaggar@tum.de>
Co-authored-by: stefan-it <>
* styling
* Add doc
* fix copies
* add co author
* Other Author
* Merge branch 'main' of https://github.com/huggingface/transformers into add-umt5
* add testing
* nit
* Update docs/source/en/model_doc/umt5.mdx
Co-authored-by: Stefan Schweter <stefan@schweter.it>
* fix t5
* actual fix?
* revert wrong changes
* remove
* update test
* more fixes
* revert some changes
* add SPIECE_UNDERLINE
* add a common example
* update
* fix copies
* revert changes on t5 conversion script
* revert bytefallback changes since there was no addition yet
* fixup
* fixup
* ignore umt5 custom testing folder
* fix readmes
* revert T5 changes
* same outputs
* fixup
* update example
* Apply suggestions from code review
* style
* draft addition of all new files
* current update
* fix attention and stuff
* finish refactoring
* auto config
* fixup
* more nits
* add umt5 to init
* use md format
* Update README.md
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* revert changes on mt5
* revert mt5 changes
* update test
* more fixes
* add to mapping
* fix-copies
* fix copies
* fix retain grad
* fix some tests
* nits
* done
* Update src/transformers/models/umt5/modeling_umt5.py
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update docs/source/en/model_doc/umt5.md
* Update src/transformers/models/umt5/__init__.py
* Update docs/source/en/model_doc/umt5.md
Co-authored-by: Stefan Schweter <stefan@schweter.it>
* Update src/transformers/models/umt5/modeling_umt5.py
* update conversion script + use google checkpoints
* nits
* update test and modelling
* stash slow convert
* update fixup
* don't change slow
---------
Co-authored-by: stefan-it <>
Co-authored-by: Stefan Schweter <stefan@schweter.it>
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
2023-07-03 07:38:21 +02:00
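Finally, a hedged end-to-end sketch of the newly added model, using the google/umt5-small checkpoint name that the conversion-script commits reference:

```python
from transformers import AutoTokenizer, UMT5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/umt5-small")
model = UMT5ForConditionalGeneration.from_pretrained("google/umt5-small")

# Span-corruption style prompt; the input string is illustrative.
inputs = tokenizer("A <extra_id_0> walks into a bar.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```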