Using the modular_converter linter

pip install libcst is a must!

Run sh examples/modular-transformers/convert_examples.sh to regenerate the converted outputs.

The modular converter is a new linter specific to transformers. It unpacks inheritance in Python to convert a modular file such as modular_gemma.py into a single, self-contained modeling file.
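The idea of "unpacking inheritance" can be illustrated with a minimal sketch. The class names below (LlamaMLP, MyNewModelMLP) are hypothetical stand-ins, not real transformers classes: a modular file defines a class by inheriting and overriding only what changes, and the converter emits an equivalent flattened class with every method inlined.

```python
class LlamaMLP:                      # pretend this lives in another model's modeling file
    def forward(self, x):
        return self.up(x) * 2

    def up(self, x):
        return x + 1


class MyNewModelMLP(LlamaMLP):       # the modular file only overrides `up`
    def up(self, x):
        return x + 10


# The generated single-file model would contain the flattened equivalent,
# with the inherited `forward` copied in next to the overridden `up`:
class MyNewModelMLPFlattened:
    def forward(self, x):
        return self.up(x) * 2

    def up(self, x):
        return x + 10


print(MyNewModelMLP().forward(1), MyNewModelMLPFlattened().forward(1))  # both print 22
```

The generated file behaves identically to the modular one, but no longer depends on the other model's code at runtime.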

Examples of possible usage are available in examples/modular-transformers; see modular_gemma for a full-model example.

Run the converter on a single modular file like so:

python utils/modular_model_converter.py --files_to_parse "/Users/arthurzucker/Work/transformers/examples/modular-transformers/modular_my_new_model2.py"

How it works

We use the libcst parser to produce an AST representation of the modular_xxx.py file. For any imports that are made from transformers.models.modeling_xxxx we parse the source code of that module and build a class dependency mapping, which allows us to unpack the inheritance dependencies.
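A minimal sketch of the dependency-mapping step, using the stdlib ast module for illustration (the real converter works on a libcst tree, and this toy source string is hypothetical): collect the names imported from modeling_* modules, then record which locally defined classes inherit from them.

```python
import ast

modular_source = """
from transformers.models.llama.modeling_llama import LlamaMLP, LlamaAttention

class MyNewModelMLP(LlamaMLP):
    pass

class MyNewModelAttention(LlamaAttention):
    pass
"""

tree = ast.parse(modular_source)

# Collect the names imported from any modeling_* module.
imported = set()
for node in ast.walk(tree):
    if isinstance(node, ast.ImportFrom) and "modeling_" in (node.module or ""):
        imported.update(alias.name for alias in node.names)

# Map each local class to the imported base classes it depends on.
dependencies = {
    node.name: [b.id for b in node.bases if isinstance(b, ast.Name) and b.id in imported]
    for node in ast.walk(tree)
    if isinstance(node, ast.ClassDef)
}
print(dependencies)
# {'MyNewModelMLP': ['LlamaMLP'], 'MyNewModelAttention': ['LlamaAttention']}
```

Given such a mapping, the converter knows which upstream modeling files to parse so the inherited definitions can be inlined.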

The code from the modular file and the class dependency mapping are "merged" to produce the final single-file model. We use ruff to automatically remove any duplicate imports.

Why we use libcst instead of the native ast module

The native ast module is powerful, but it does not preserve docstrings, comments, or code formatting. Thus we decided to go with libcst.
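This limitation is easy to demonstrate with the stdlib alone: round-tripping source through ast.parse and ast.unparse silently drops comments (and the original formatting), which is exactly what a code-generating linter cannot afford.

```python
import ast

source = "x = 1  # this comment matters\n"

# Parse and regenerate the source with the native ast module.
regenerated = ast.unparse(ast.parse(source))
print(repr(regenerated))  # 'x = 1' -- the comment is gone
```

libcst, by contrast, builds a concrete syntax tree: regenerating code from an unmodified libcst tree reproduces the input byte-for-byte, comments and whitespace included.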