The LayerNorm gamma and beta should be initialized with `.fill_(1.0)` and `.zero_()` respectively. reference links:
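For context, a minimal sketch of the intended initialization, assuming a standard `torch.nn.LayerNorm` (whose `weight` and `bias` parameters correspond to gamma and beta); the hand-rolled LayerNorm in `modeling.py` exposes the same parameters under different names, so this is illustrative rather than the exact patch:

```python
from torch import nn

def init_weights(module):
    # gamma (scale) starts at 1.0 and beta (shift) at 0.0, so LayerNorm
    # initially performs pure normalization with no rescaling or offset.
    if isinstance(module, nn.LayerNorm):
        module.weight.data.fill_(1.0)  # gamma
        module.bias.data.zero_()       # beta

# Hypothetical usage: apply the initializer recursively to every submodule.
model = nn.Sequential(nn.Linear(16, 16), nn.LayerNorm(16))
model.apply(init_weights)
```

Initializing gamma and beta from a normal distribution (as a generic weight initializer might) would randomly rescale and shift activations at the start of training; fill_(1.0)/zero_() makes the layer begin as pure normalization.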