Mirror of https://github.com/huggingface/transformers.git
* Fix inverted conditional in TF common test!
* Make the same change in the PT tests file
* Make sure hidden states for GPT2 have the same output shape in PT/TF
* Minor fix to PT implementation of token classification loss
* Skip loss equivalence test for TFHubert because it keeps overflowing to inf (see the first sketch below)
* Compute LM loss for TF the (weird) way it's computed in PT (see the second sketch below)
* Skip loss equivalence test for Wav2Vec2 for the same reason as Hubert
* Fix - don't try to access the hidden states property when output is a tuple (see the third sketch below)
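Skipping a flaky equivalence check uses a plain `unittest` skip. A minimal sketch of the pattern; the class name, test name, and reason text are illustrative, not necessarily the exact override in these files:

```python
import unittest


class TFHubertModelTest(unittest.TestCase):
    @unittest.skip("Hubert loss keeps overflowing to inf, so PT/TF loss equivalence can't be compared")
    def test_pt_tf_model_equivalence(self):
        # Body never runs; the decorator reports the test as skipped.
        pass
```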
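The "(weird) way it's computed in PT" refers to the causal LM shift: logits at position t are scored against the token at position t+1, and positions labeled -100 are ignored. A minimal TF sketch of that behavior, assuming logits of shape (batch, seq, vocab) and integer labels of shape (batch, seq); the function name is an illustrative assumption, not the library's API:

```python
import tensorflow as tf


def shifted_causal_lm_loss(labels, logits):
    # Score position t against token t+1, matching the PyTorch shift:
    # drop the last logit and the first label.
    shift_logits = logits[:, :-1, :]
    shift_labels = labels[:, 1:]
    # PT's CrossEntropyLoss ignores positions labeled -100; mask them here too.
    active = tf.not_equal(shift_labels, -100)
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(
        from_logits=True, reduction=tf.keras.losses.Reduction.NONE
    )
    losses = loss_fn(
        tf.boolean_mask(shift_labels, active),
        tf.boolean_mask(shift_logits, active),
    )
    return tf.reduce_mean(losses)
```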
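The last fix matters because models called with `return_dict=False` return a plain tuple rather than a `ModelOutput`, so attribute access raises. A hypothetical helper showing the defensive pattern; the tuple index is an assumption, since the real position depends on the model's output ordering:

```python
def get_hidden_states(outputs):
    if isinstance(outputs, tuple):
        # Tuple outputs carry hidden states positionally; we assume the
        # last element here, which varies by model and requested outputs.
        return outputs[-1]
    # ModelOutput objects expose hidden states as an attribute.
    return outputs.hidden_states
```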
Files in this directory:

* __init__.py
* test_modeling_hubert.py
* test_modeling_tf_hubert.py