Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-31 02:02:21 +06:00)
* Add Cambricon MLU support
* Fix MLU device RNG state
* Update for quality check
* Update MLU to support fp16
* Fix MLU device dependency error
* Enable MLU device for bf16
* Fix MLU device memory tracker
* Support SDPA and flash_attn on Cambricon
* MLU devices: check if `mlu` is available via a cndev-based check which won't trigger the drivers and leave mlu
* Fix MLU FA2 check; remove deepspeed-mlu check; add MLU test support
* Fix testing errors
* Merge branch 'hf/main' into main
* Fix `get_device_count` error
* Fix MLU testing utils
* Fix code quality and style
* Switch to `@require_torch_multi_accelerator`
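The availability check described above avoids touching the MLU drivers. A minimal sketch of that pattern, assuming only the Python standard library (the helper name `is_torch_mlu_available` mirrors the utility this change introduces; probing for the `torch_mlu` package here is an illustrative stand-in for the actual cndev-based check):

```python
import importlib.util


def is_torch_mlu_available() -> bool:
    # Hypothetical sketch: probe for the torch_mlu package by spec lookup
    # only, so no device drivers are loaded or initialized as a side effect.
    return importlib.util.find_spec("torch_mlu") is not None


def pick_device() -> str:
    # Prefer the MLU accelerator when the probe reports one; fall back to CPU.
    return "mlu" if is_torch_mlu_available() else "cpu"
```

Checking for an importable package spec, rather than importing it, keeps the probe cheap and side-effect free on machines without the accelerator stack installed.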
__init__.py
test_beam_constraints.py
test_beam_search.py
test_candidate_generator.py
test_configuration_utils.py
test_fsdp.py
test_logits_process.py
test_stopping_criteria.py
test_streamers.py
test_utils.py