transformers/docs/source
pglorio 33cb1f7b61
Add Zamba2 (#34517)
* First commit

* Finish model implementation

* Register zamba2

* generated modeling and configuration

* added hybrid cache

* fix attention_mask in mamba

* dropped unused loras

* fix flash2

* config docstrings

* fix config and fwd pass

* make fixup fixes

* test_modeling_zamba2

* small fixes

* make fixup fixes

* Fix modular model converter

* added inheritances in modular, renamed zamba cache

* modular rebase

* new modular conversion

* fix generated modeling file

* fixed import for Zamba2RMSNormGated

* modular file cleanup

* make fixup and model tests

* dropped inheritance for Zamba2PreTrainedModel

* make fixup and unit tests

* Add inheritance of rope from GemmaRotaryEmbedding

* moved rope to model init

* drop del self.self_attn and del self.feed_forward

* fix tests

* renamed lora -> adapter

* rewrote adapter implementation

* fixed tests

* Fix torch_forward in mamba2 layer

* Dropped adapter in-place sum

* removed rope from attention init

* updated rope

* created get_layers method

* make fixup fix

* make fixup fixes

* update to new attention standard

* make fixup fixes

* minor fixes

* cache_position

* removed cache_position, position_ids, use_cache

* remove config from modular

* removed config from modular (2)

* import apply_rotary_pos_emb from llama

* fixed rope_kwargs

* Instantiate cache in Zamba2Model

* fix cache

* fix @slow decorator

* small fix in modular file

* Update docs/source/en/model_doc/zamba2.md

Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>

* several minor fixes

* inherit mamba2decoder fwd and drop position_ids in mamba

* removed docstrings from modular

* reinstate zamba2 attention decoder fwd

* use regex for tied keys

* Revert "use regex for tied keys"

This reverts commit 9007a522b1.

* use regex for tied keys

* add cpu to slow forward tests

* dropped config.use_shared_mlp_adapter

* Update docs/source/en/model_doc/zamba2.md

Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>

* re-convert from modular

---------

Co-authored-by: root <root@node-2.us-southcentral1-a.compute.internal>
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
2025-01-27 10:51:23 +01:00
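For context, a minimal usage sketch of the model this commit adds. It assumes a transformers build that includes this PR, and uses the `Zyphra/Zamba2-2.7B` checkpoint id as a stand-in for any released Zamba2 checkpoint:

```python
# Minimal sketch: run text generation with Zamba2 through the standard
# transformers Auto classes. The checkpoint id below is an assumption;
# substitute any released Zamba2 checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Zyphra/Zamba2-2.7B"  # stand-in checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # run the hybrid Mamba2/attention stack in half precision
    device_map="auto",           # requires accelerate
)

inputs = tokenizer("Zamba2 is a hybrid model that", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```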
ar Remove old benchmark code (#35730) 2025-01-21 17:56:43 +00:00
de Fix typos in translated quicktour docs (#35302) 2024-12-17 09:32:00 -08:00
en Add Zamba2 (#34517) 2025-01-27 10:51:23 +01:00
es Fix typos in translated quicktour docs (#35302) 2024-12-17 09:32:00 -08:00
fr Add French translation of task_summary and tasks_explained (#33407) 2025-01-06 14:23:52 +01:00
hi [i18n-HI] Translated TFLite page to Hindi (#34572) 2024-11-04 09:40:30 -08:00
it Fix typos in translated quicktour docs (#35302) 2024-12-17 09:32:00 -08:00
ja [DOC] Fix contamination and missing paragraph in translation (#35851) 2025-01-23 08:33:44 -08:00
ko Remove old benchmark code (#35730) 2025-01-21 17:56:43 +00:00
ms Remove old benchmark code (#35730) 2025-01-21 17:56:43 +00:00
pt Fix typos in translated quicktour docs (#35302) 2024-12-17 09:32:00 -08:00
te Fix typos in translated quicktour docs (#35302) 2024-12-17 09:32:00 -08:00
tr Translate index.md to Turkish (#27093) 2023-11-08 08:35:20 -05:00
zh Remove old benchmark code (#35730) 2025-01-21 17:56:43 +00:00
_config.py [#29174] ImportError Fix: Trainer with PyTorch requires accelerate>=0.20.1 Fix (#29888) 2024-04-08 14:21:16 +01:00