transformers/docs/source
Latest commit: cf32ee1753 by Joao Gante, 2024-08-16 11:48:45 +01:00
Cache: use batch_size instead of max_batch_size (#32657)

* more precise name
* better docstrings
* Update src/transformers/cache_utils.py

Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
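The commit above renames the cache classes' `max_batch_size` argument to `batch_size`. As a minimal sketch of what that looks like in use (assuming a transformers build that includes #32657; the TinyLlama checkpoint and token counts are only illustrative choices, not part of this listing):

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, StaticCache

    model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # any StaticCache-compatible model
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer("Static caches preallocate memory up front", return_tensors="pt")

    # `batch_size` replaces the older `max_batch_size` keyword; the old name is
    # expected to keep working for a while but with a deprecation warning.
    past_key_values = StaticCache(
        config=model.config,
        batch_size=1,
        max_cache_len=inputs.input_ids.shape[1] + 20,
        device=model.device,
        dtype=model.dtype,
    )

    # Run a forward pass that writes into the preallocated static cache.
    outputs = model(**inputs, past_key_values=past_key_values, use_cache=True)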
de Skip tests properly (#31308) 2024-06-26 21:59:08 +01:00
en Cache: use batch_size instead of max_batch_size (#32657) 2024-08-16 11:48:45 +01:00
es 🚨 No more default chat templates (#31733) 2024-07-24 17:36:32 +01:00
fr Add French version of run scripts tutorial (#31483) 2024-06-28 18:02:30 +02:00
hi More fixes for doctest (#30265) 2024-04-16 11:58:55 +02:00
it Docs / Quantization: Replace all occurrences of load_in_8bit with bnb config (#31136) 2024-05-30 16:47:35 +02:00
ja Generate: unify LogitsWarper and LogitsProcessor (#32626) 2024-08-16 11:20:41 +01:00
ko 🌐 [i18n-KO] Translated awq.md to Korean (#32324) 2024-08-12 10:12:48 -07:00
ms Remove old TF port docs (#30426) 2024-04-23 16:06:20 +01:00
pt Use HF_HUB_OFFLINE + fix has_file in offline mode (#31016) 2024-05-29 11:55:43 +01:00
te docs: fix broken link (#31370) 2024-06-12 11:33:00 +01:00
tr Translate index.md to Turkish (#27093) 2023-11-08 08:35:20 -05:00
zh Generate: unify LogitsWarper and LogitsProcessor (#32626) 2024-08-16 11:20:41 +01:00
_config.py [#29174] ImportError Fix: Trainer with PyTorch requires accelerate>=0.20.1 Fix (#29888) 2024-04-08 14:21:16 +01:00