Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-31 02:02:21 +06:00)
Make clearer about zero_init requirements (#29879)
* Docstring to note about zero init
* Check for accelerate
* Change conditional return
* Tweak
* Add new accelerate-specific zero3 check
* Fix import
* Revert to RTFM
* Update src/transformers/modeling_utils.py

Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>

---------

Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
This commit is contained in:
parent 695d823323
commit 863e2562d8
```diff
@@ -504,6 +504,11 @@ class TrainingArguments:
             evolve in the future. The value is either the location of DeepSpeed json config file (e.g.,
             `ds_config.json`) or an already loaded json file as a `dict`"
+
+            <Tip warning={true}>
+            If enabling any Zero-init, make sure that your model is not initialized until
+            *after* initializing the `TrainingArguments`, else it will not be applied.
+            </Tip>
 
         accelerator_config (`str`, `dict`, or `AcceleratorConfig`, *optional*):
             Config to be used with the internal `Accelerator` implementation. The value is either a location of
             accelerator json config file (e.g., `accelerator_config.json`), an already loaded json file as `dict`,
```
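The ordering requirement in the new Tip is easy to trip over: constructing `TrainingArguments` with a `deepspeed` argument is what activates DeepSpeed's ZeRO-3 `zero.Init` machinery, so a model instantiated beforehand is allocated normally and never partitioned. Below is a minimal sketch of both orderings, assuming a ZeRO-3 DeepSpeed config; the config path, model name, and output directory are illustrative, not from the commit:

```python
# Minimal sketch of the ordering described in the Tip above, assuming a
# DeepSpeed ZeRO-3 JSON config; "ds_config_zero3.json" and "gpt2" are
# illustrative placeholders.
from transformers import AutoModelForCausalLM, TrainingArguments

# Correct: build TrainingArguments first. Parsing its `deepspeed` argument
# enables the ZeRO-3 / zero.Init machinery, so model weights can be
# partitioned across ranks as they are loaded.
training_args = TrainingArguments(
    output_dir="out",
    deepspeed="ds_config_zero3.json",
)
model = AutoModelForCausalLM.from_pretrained("gpt2")  # loaded under zero.Init

# Wrong: a model created *before* TrainingArguments exists is allocated
# without zero.Init, and Zero-init will not be applied to it:
#   model = AutoModelForCausalLM.from_pretrained("gpt2")
#   training_args = TrainingArguments(output_dir="out", deepspeed="ds_config_zero3.json")
```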
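The same hunk's context lines document `accelerator_config`, which accepts a file path, a dict, or an `AcceleratorConfig` object. A hedged sketch of the dict form follows; `split_batches` and `even_batches` are standard `AcceleratorConfig` fields, though the exact set is version-dependent:

```python
# Sketch of passing accelerator_config inline as a dict rather than as a path
# to a JSON file; keys mirror fields of the AcceleratorConfig dataclass.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",
    accelerator_config={
        "split_batches": True,  # split each fetched batch across devices
        "even_batches": True,   # keep batch counts equal across processes
    },
)
```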