riaz.somc / transformers
Mirror of https://github.com/huggingface/transformers.git (synced 2025-08-03 03:31:05 +06:00)
transformers / tests / models / vipllava (at commit 471958b620)
Latest commit: 1cfcbfcab8 by Raushan Turganbay, [VLMs] fix flash-attention tests (#37603), 2025-04-24 11:48:11 +02:00
    * fix one test
    * fa2 ln test
    * remove keys from config recursively
    * fix
    * fixup
__init__.py                  Adds VIP-llava to transformers (#27932)       2023-12-13 10:42:24 +01:00
test_modeling_vipllava.py    [VLMs] fix flash-attention tests (#37603)     2025-04-24 11:48:11 +02:00