Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-23 06:20:22 +06:00)
Latest commit:

* 1,100%!
* Clean
* Don't touch DS
* Experiment with dtype allocation
* skip test_load_save_without_tied_weights test
* A little faster
* Include proper upscaling?
* Fixup tests
* Potentially skip?
* Let's see if this fixes git history
* Maintain new dtype
* Fin
* Rm hook idea for now
* New approach, see what breaks
* stage
* Clean
* Stash
* Should be fin now, just need to mark failing models
* Clean up
* Simplify
* Deal with weird models
* Enc/Dec
* Skip w/ reason
* Adjust test
* Fix test
* one more test
* Keep experimenting
* Fix ref
* TO REMOVE: testing feedback CI
* Right push
* Update tests/utils/test_modeling_utils.py (Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>)
* disable
* Add new func
* Test nits from Amy
* Update src/transformers/modeling_utils.py (Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>)
* Adjust comment
* Adjust comment on skip
* make private
* Fin
* Should be a not flag
* Clarify and rename test

Co-authored-by: Marc Sun <marc@huggingface.co>
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
- agent.md
- backbones.md
- callback.md
- configuration.md
- data_collator.md
- deepspeed.md
- feature_extractor.md
- image_processor.md
- keras_callbacks.md
- logging.md
- model.md
- onnx.md
- optimizer_schedules.md
- output.md
- pipelines.md
- processors.md
- quantization.md
- text_generation.md
- tokenizer.md
- trainer.md