* No more Tuple, List, Dict (typing sketch after this block)
* make fixup
* More style fixes
* Docstring fixes with regex replacement
* Trigger tests
* Redo fixes after rebase
* Fix copies
* [test all]
* update
* [test all]
* update
* [test all]
* make style after rebase
* Patch the hf_argparser test
* Patch the hf_argparser test
* style fixes
* style fixes
* style fixes
* Fix docstrings in Cohere test
* [test all]
---------
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
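A minimal sketch of the annotation change behind "No more Tuple, List, Dict", assuming it refers to swapping `typing.Tuple`/`List`/`Dict` for the builtin generics of PEP 585 (Python >= 3.9); the function below is purely illustrative:

```python
from typing import Optional

# Before: from typing import Dict, List, Tuple
# def bucket(values: List[int]) -> Tuple[Dict[str, int], Optional[int]]: ...

# After: builtin generics, no typing imports needed for plain containers.
def bucket(values: list[int]) -> tuple[dict[str, int], Optional[int]]:
    counts: dict[str, int] = {str(v): v for v in values}
    first = values[0] if values else None
    return counts, first


print(bucket([1, 2, 3]))  # ({'1': 1, '2': 2, '3': 3}, 1)
```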
* Add implementation for DataCollatorForMultipleChoice based on docs (usage sketch after this block).
* Add DataCollatorForMultipleChoice to import structure.
* Remove custom DataCollatorForMultipleChoice implementations from example scripts.
* Remove custom implementations of DataCollatorForMultipleChoice from docs in English, Spanish, Japanese and Korean.
* Refactor torch version of DataCollatorForMultipleChoice to be more easily understandable.
* Apply suggested changes and run make fixup.
* fix copies, style and fixup
* add missing documentation
* nits
* fix docstring
* style
* nits
* isort
---------
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
Co-authored-by: Arthur Zucker <arthur.zucker@gmail.com>
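A hedged usage sketch of the collator added in the commits above, assuming its import path and `tokenizer=` argument mirror the other data collators in `transformers`; the pre-tokenized per-choice inputs are made up for illustration:

```python
from transformers import AutoTokenizer, DataCollatorForMultipleChoice

tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-uncased")
collator = DataCollatorForMultipleChoice(tokenizer=tokenizer)

# Each feature carries one tokenized sequence per answer choice plus a label.
features = [
    {
        "input_ids": [[101, 7592, 102], [101, 7592, 2088, 102]],
        "attention_mask": [[1, 1, 1], [1, 1, 1, 1]],
        "label": 0,
    },
    {
        "input_ids": [[101, 2425, 102], [101, 2425, 2033, 102]],
        "attention_mask": [[1, 1, 1], [1, 1, 1, 1]],
        "label": 1,
    },
]

batch = collator(features)
print(batch["input_ids"].shape)  # padded to (batch_size, num_choices, seq_len), e.g. [2, 2, 4]
print(batch["labels"])           # tensor([0, 1])
```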
* Trainer - deprecate tokenizer for processing_class (usage sketch below)
* Extend change across Seq2Seq trainer and docs
* Add tests
* Update to FutureWarning and add deprecation version
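A sketch of the renamed argument, assuming (as the commits above describe) that `processing_class` replaces `tokenizer` and the old name keeps working behind a FutureWarning; the checkpoint and two-row dataset are placeholders:

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

checkpoint = "distilbert-base-uncased"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

train_dataset = Dataset.from_dict({"text": ["good", "bad"], "label": [1, 0]}).map(
    lambda batch: tokenizer(batch["text"], truncation=True), batched=True
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", report_to="none"),
    train_dataset=train_dataset,
    processing_class=tokenizer,  # formerly tokenizer=tokenizer, now deprecated
)
```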
* No more default chat templates (template sketch below)
* Add the template to the GPT-SW3 tests since it's not available by default now
* Fix GPT2 test
* Fix Bloom test
* Fix Bloom test
* Remove default templates again
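With the class-level defaults removed, a tokenizer that ships without its own template needs one assigned before `apply_chat_template`; the ChatML-style Jinja string below is only an illustrative stand-in for whatever the model actually expects:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # has no chat template of its own

# Previously a library-wide default was used silently; now the template must be
# set explicitly (this one is a made-up ChatML-style example).
tokenizer.chat_template = (
    "{% for message in messages %}"
    "<|im_start|>{{ message['role'] }}\n{{ message['content'] }}<|im_end|>\n"
    "{% endfor %}"
)

messages = [{"role": "user", "content": "Hello!"}]
print(tokenizer.apply_chat_template(messages, tokenize=False))
```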
* add tokenizer_summary to es/_toctree.yml
* add tokenizer_summary to es/
* fix link to Transformer-XL in en/
* translate until Subword tokenization section
* fix GPT link in en/
* fix other GPT link in en/
* fix typo in en/
* translate the doc
* run make fixup
* Remove .md in Transformer XL link
* fix some link issues in es/
* fix typo
* Fix has_file in offline mode
* harmonize env variable for offline mode
* Switch to HF_HUB_OFFLINE (sketch after this block)
* fix test
* revert test_offline to test TRANSFORMERS_OFFLINE
* Add new offline test
* merge conflicts
* docs
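A minimal sketch of the harmonized flag: exporting HF_HUB_OFFLINE before importing forces cache-only loading (the checkpoint name is a placeholder and must already be in the local cache for the call to succeed):

```python
import os

# HF_HUB_OFFLINE is the variable shared with huggingface_hub; the older
# TRANSFORMERS_OFFLINE is still honoured, as the tests above check.
os.environ["HF_HUB_OFFLINE"] = "1"

from transformers import AutoModel

# Nothing is fetched from the Hub: loading succeeds only from the local cache,
# otherwise an error is raised instead of a silent download attempt.
model = AutoModel.from_pretrained("bert-base-uncased")
```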
* add model_memory_anatomy to es/_toctree.yml
* copy model_memory_anatomy.md to es/
* translate first section
* translate doc
* change forward activations
* fix sentence and link to Trainer
* fix Trainer link
* Fix ImportError: Trainer with PyTorch requires accelerate>=0.20.1
Add the evaluate and accelerate installs at the beginning of the cell to fix the issue (install check sketch after this block)
* ImportError Fix: Trainer with PyTorch requires accelerate>=0.20.1
* Import Error Fix
* Update installation.md
* Update quicktour.md
* rollback other lang changes
* Update _config.py
* updates for other languages
* fixing error
* Tutorial Update
* Update tokenization_utils_base.py
* Just use an optimizer string to pass the doctest?
---------
Co-authored-by: Matt <rocketknight1@gmail.com>
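For the accelerate ImportError commits above, a sketch of the check the notebook cell relies on once the installs are added at the top; the version pin simply mirrors the error message:

```python
# Top of the notebook cell, per the fix:
#   !pip install "accelerate>=0.20.1" evaluate
from packaging import version

import accelerate

# Matches the requirement quoted in the ImportError.
assert version.parse(accelerate.__version__) >= version.parse("0.20.1")
```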
* torchscript and trainer md es translation
* corrected md es files and even corrected spelling in en md
* made es corrections to trainer.md
* deleted entrenamiento... title on yml
* placed entrenamiento in right place
* translated es chat_templating.md w/ yml addition
* requested es changes to md and yml
* last es changes to md
* Add tasks_explained.md to es/
* Fix little typo in en/ version
* translate speech/audio section
* translate part of computer vision section | fix little typo in en/
* Fix little typo in en/
* Translate computer vision section | change ** ** to * * in both files
* Translate NLP section | fix link to task/translation in en/
* Update link in es/tasks_summary.md
* Fix task_summary title link
* Add task_summary to es/_toctree.yml
* Add task_summary.md to docs/es
* Change title of task_summary.md
* Translate firsts paragraphs
* Translate middle paragraphs
* Translate the rest of the doc
* Edit first paragraph
* Add missing entries to the language selector
* Add links to the Colab and AWS Studio notebooks for ONNX
* Use anchor links in CONTRIBUTING.md
* Fix broken hyperlinks due to spaces
* Fix links to OpenAI research articles
* Remove confusing footnote symbols from author names, as they are also considered invalid markup
* Sort es/_toctree.yml like en/_toctree.yml
* Run make style
* Add -Rendimiento y escalabilidad- section to es/_toctree.yml
* Run make style
* Add s to section
* Add translate of performance.md
* Add performance.md to es/_toctree.yml
* Run make style
* Fix docs links
* Run make style
* Add glossary to es/_toctree.yml
* Add glossary.md to es/
* A section translated
* B and C section translated
* Fix typo in en/glossary.md C section
* D section translated | Add an extra line in en/glossary.md
* E and F section translated | Fix typo in en/glossary.md
* Fix the word preentrenado
* H and I section translated | Fix typo in en/glossary.md
* L section translated
* M and N section translated
* P section translated
* R section translated
* S section translated
* T section translated
* U and Z section translated | Fix TensorParallel link in both files
* Fix word
* Add pad_truncation to es/_toctree.yml
* Add pad_truncation.md to es/
* Translated first two paragraphs
* Translated padding argument section
* Translated truncation argument section
* Translated final paragraphs
* Translated table
* Fixed typo in the table of en/pad_truncation.md
* Run make style | Fix a word
* Add Padding (relleno) and Truncation (truncamiento) in the final paragraphs
* Fix relleno and truncamiento words
* Copy perplexity.md file to es/ folder
* Adding perplexity to es/_toctree.yml
* Translate first section
* Calculating PPL section translate
* Example section translate
* fix translation of log-likelihood
* Fix title translation
* Fix \ in second paragraph
* Change verosimilitud to log-likelihood
* Run 'make style'
* docs: replace torch.distributed.run with torchrun (command sketch after this block)
`transformers` now officially supports PyTorch >= 1.10.
The entrypoint `torchrun` is present from 1.10 onwards.
Signed-off-by: Peter Pan <Peter.Pan@daocloud.io>
* Update src/transformers/trainer.py
with @ArthurZucker's suggestion
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
---------
Signed-off-by: Peter Pan <Peter.Pan@daocloud.io>
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
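A sketch of the command substitution the docs change performs; `run_glue.py` and its arguments are placeholders:

```python
import shlex

# Before: python -m torch.distributed.run --nproc_per_node=2 run_glue.py ...
# After (torchrun ships with PyTorch >= 1.10, the minimum noted above):
cmd = "torchrun --nproc_per_node=2 run_glue.py --model_name_or_path bert-base-cased"
print(shlex.split(cmd))
```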