# Dockers for transformers

In this folder you will find various dockerfiles and some subfolders.

- Dockerfiles (e.g. `consistency.dockerfile`) present directly under `~/docker` are used for our "fast" CIs. You should be able to use them for tasks that only need a CPU. For example, `torch-light` is a very lightweight container (703MiB).
- Subfolders contain the dockerfiles used for our slow CIs, which can be used for GPU tasks, but they are BIG, as they were not designed for a single model or a single task. For instance, `~/docker/transformers-pytorch-gpu` includes additional dependencies that allow us to run ALL model tests (say `librosa` or `tesseract`, which you do not need just to run LLMs). See the build-and-run sketch after this list.
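
As a minimal sketch, building and running one of each kind from the repository root could look like the following. The image tags are hypothetical (chosen here for illustration), and the GPU example assumes the NVIDIA Container Toolkit is installed:

```sh
# Fast-CI image: build the lightweight torch dockerfile and open a shell in it
docker build -f docker/torch-light.dockerfile -t transformers-torch-light .
docker run -it --rm transformers-torch-light bash

# Slow-CI image: build one of the big GPU images and pass the GPUs through
docker build -f docker/transformers-pytorch-gpu/Dockerfile -t transformers-pytorch-gpu .
docker run -it --rm --gpus all transformers-pytorch-gpu bash
```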

Note that in both cases, you need to run `uv pip install -e .`, which should take around 5 seconds. We do it outside the dockerfile to suit the needs of our CI: we check out a new branch each time, so the transformers code gets updated.
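
Inside a container, that flow roughly amounts to the following sketch (the branch name and clone location are illustrative, not what CI literally runs):

```sh
# Get the code under test; CI checks out the branch being tested
git clone https://github.com/huggingface/transformers.git
cd transformers
git checkout my-feature-branch   # hypothetical branch name

# Editable install, so the freshly checked-out code is what gets tested
uv pip install -e .
```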

We are open to contributions and invite the community to create dockerfiles with build arguments that properly choose the extras depending on a model's dependencies! 🤗
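
As a sketch of what such a contribution could look like, a dockerfile could take a build argument that selects which extras to install. Everything here is hypothetical (the base image, argument name, and default are illustrative, not an existing file in this folder):

```dockerfile
# Hypothetical contributed dockerfile with a configurable extras list
FROM python:3.10-slim

# Pick the extras at build time, e.g.:
#   docker build --build-arg EXTRAS="torch,vision" -f model.dockerfile -t my-model-image .
ARG EXTRAS="torch"

# Install transformers with only the extras this image actually needs
RUN pip install --no-cache-dir uv \
    && uv pip install --system "transformers[${EXTRAS}]"
```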