
# Dockers for transformers

In this folder you will find various docker files and some subfolders.

- Dockerfiles (e.g. `consistency.dockerfile`) present directly under `~/docker` are used for our "fast" CIs. You should be able to use them for tasks that only need a CPU. For example, `torch-light` is a very lightweight container (703MiB); see the build sketch after this list.
- Subfolders contain Dockerfiles used for our "slow" CIs, which can be used for GPU tasks, but they are BIG, as they were not designed for a single model or a single task. For instance, `~/docker/transformers-pytorch-gpu` includes additional dependencies that allow us to run ALL model tests (say `librosa` or `tesseract`, which you do not need to run LLMs).
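As a concrete example, here is a minimal sketch of building and running one of the light CPU images from the root of a `transformers` checkout (the `transformers-torch-light` tag is just an illustrative name):

```bash
# Build the lightweight CPU image from the repository root (the tag is arbitrary).
docker build -f docker/torch-light.dockerfile -t transformers-torch-light .

# Open an interactive shell inside the container.
docker run -it --rm transformers-torch-light bash
```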

Note that in both cases, you still need to run `uv pip install -e .`, which should take around 5 seconds. We do it outside the dockerfile to suit the needs of our CI: we check out a new branch on each run, so the `transformers` code under test is always up to date.
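For instance, a local session could look like this (a sketch, assuming the `transformers-torch-light` image built above and your checkout mounted at `/transformers`):

```bash
# Mount the current checkout and install it in editable mode,
# so the container runs exactly the code you checked out.
docker run -it --rm -v "$(pwd)":/transformers -w /transformers \
    transformers-torch-light \
    bash -c "uv pip install -e . && python -c 'import transformers; print(transformers.__version__)'"
```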

We are open to contributions, and invite the community to create Dockerfiles with build arguments that properly choose extras depending on the model's dependencies! 🤗
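As a starting point, such a Dockerfile could expose the extras as a build argument (a hypothetical sketch; the base image, tag, and `EXTRAS` argument name are placeholders, not an existing file in this folder):

```dockerfile
FROM python:3.10-slim

# Hypothetical build argument selecting which extras to install, e.g.:
#   docker build --build-arg EXTRAS="audio,vision" -t transformers-extras .
ARG EXTRAS="torch"

RUN pip install --no-cache-dir uv
RUN uv pip install --system --no-cache "transformers[${EXTRAS}]"
```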