mirror of
https://github.com/huggingface/transformers.git
synced 2025-07-31 02:02:21 +06:00
Fix minor typo: softare => software (#29602)
This commit is contained in:
parent
6cc5411d81
commit
73efe896df
@@ -65,7 +65,7 @@ training your model with [`Trainer`] or writing a pure PyTorch loop, in which ca
 with 🤗 Accelerate](#using--accelerate).
 
 If these methods do not result in sufficient gains, you can explore the following options:
 
-* [Look into building your own custom Docker container with efficient softare prebuilds](#efficient-software-prebuilds)
+* [Look into building your own custom Docker container with efficient software prebuilds](#efficient-software-prebuilds)
 * [Consider a model that uses Mixture of Experts (MoE)](#mixture-of-experts)
 * [Convert your model to BetterTransformer to leverage PyTorch native attention](#using-pytorch-native-attention-and-flash-attention)