transformers/docker
Lysandre Debut a42844955f
Loading GGUF files support (#30391)
* Adds support for loading GGUF files

* add q2_k q3_k q5_k support from @99991

* fix tests

* Update doc

* Style

* Docs

* fix CI

* Update docs/source/en/gguf.md

* Compute merges

* change logic

* add comment for clarity

* Update src/transformers/models/auto/tokenization_auto.py

* change logic

* Update src/transformers/modeling_utils.py

* change

* Apply suggestions from code review

* Update src/transformers/modeling_gguf_pytorch_utils.py

* put back comment

* add comment about mistral

* comments and added tests

* fix inconsistent type

* more

* fix tokenizer

* Update src/transformers/modeling_utils.py

* address comments about tests and tokenizer + add added_tokens

* from_gguf -> gguf_file

* replace on docs too

---------

Co-authored-by: Younes Belkada <younesbelkada@gmail.com>
Co-authored-by: 99991 <99991@users.noreply.github.com>
Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
2024-05-15 14:28:20 +02:00
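The commit above adds GGUF checkpoint loading (per the log, via a `gguf_file` argument to `from_pretrained`, renamed from `from_gguf`). As a rough illustration of the first thing any GGUF loader must handle, here is a minimal sketch of parsing the fixed-size GGUF file header; the byte layout follows the GGUF v3 specification, and the header bytes below are synthetic, fabricated purely for illustration.

```python
import struct

GGUF_MAGIC = b"GGUF"

def parse_gguf_header(data: bytes) -> dict:
    # A GGUF file begins with a fixed header: 4-byte magic "GGUF",
    # then uint32 version, uint64 tensor count, and uint64 metadata
    # key-value count, all little-endian.
    if data[:4] != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    version, n_tensors, n_kv = struct.unpack_from("<IQQ", data, 4)
    return {"version": version, "n_tensors": n_tensors, "n_kv": n_kv}

# Synthetic header for illustration: version 3, 2 tensors, 5 metadata keys.
header = GGUF_MAGIC + struct.pack("<IQQ", 3, 2, 5)
print(parse_gguf_header(header))  # → {'version': 3, 'n_tensors': 2, 'n_kv': 5}
```

After the header come the metadata key-value pairs (architecture, tokenizer vocabulary, quantization details) and the tensor descriptors; the loader in this commit dequantizes the tensors (including the q2_k/q3_k/q5_k formats mentioned in the log) into PyTorch weights.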
transformers-all-latest-gpu Loading GGUF files support (#30391) 2024-05-15 14:28:20 +02:00
transformers-doc-builder Use python 3.10 for docbuild (#28399) 2024-01-11 14:39:49 +01:00
transformers-gpu TF: TF 2.10 unpin + related onnx test skips (#18995) 2022-09-12 19:30:27 +01:00
transformers-past-gpu Byebye pytorch 1.9 (#24080) 2023-06-16 16:38:23 +02:00
transformers-pytorch-amd-gpu CI: update to ROCm 6.0.2 and test MI300 (#30266) 2024-05-13 18:14:36 +02:00
transformers-pytorch-deepspeed-amd-gpu Disable AMD memory benchmarks (#29871) 2024-03-26 14:43:12 +01:00
transformers-pytorch-deepspeed-latest-gpu Pin deepspeed (#30701) 2024-05-07 13:45:24 -04:00
transformers-pytorch-deepspeed-nightly-gpu Update CUDA versions for DeepSpeed (#27853) 2023-12-05 16:15:21 -05:00
transformers-pytorch-gpu [SDPA] Make sure attn mask creation is always done on CPU (#28400) 2024-01-09 11:05:19 +01:00
transformers-pytorch-tpu Rename master to main for notebooks links and leftovers (#16397) 2022-03-25 09:12:23 -04:00
transformers-quantization-latest-gpu Add HQQ quantization support (#29637) 2024-05-02 17:51:49 +01:00
transformers-tensorflow-gpu Update TF pin in docker image (#25343) 2023-08-07 12:32:34 +02:00
consistency.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
custom-tokenizers.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
examples-tf.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
examples-torch.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
exotic-models.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
jax-light.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
pipeline-tf.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
pipeline-torch.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
quality.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
tf-light.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
torch-jax-light.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
torch-light.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00
torch-tf-light.dockerfile [CI update] Try to use dockers and no cache (#29202) 2024-05-06 10:10:32 +02:00