Add flax/jax quickstart (#12342)
Commit f2c4ce7e33 (parent 5b1b5635d3)
@@ -120,7 +120,9 @@ TODO (should be filled by 24.06.)...
## Quickstart flax and jax
[JAX](https://jax.readthedocs.io/en/latest/index.html) is Autograd and XLA, brought together for high-performance numerical computing and machine learning research. It provides composable transformations of Python+NumPy programs: differentiate, vectorize, parallelize, Just-In-Time compile to GPU/TPU, and more. A great place for getting started with JAX is the [JAX 101 Tutorial](https://jax.readthedocs.io/en/latest/jax-101/index.html).
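
As a minimal sketch of how these transformations compose (the toy loss function below is illustrative, not taken from the tutorial):

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # Toy quadratic loss, just to have something to differentiate.
    return jnp.sum((w * x - 1.0) ** 2)

grad_loss = jax.grad(loss)                              # differentiate w.r.t. w
batched_grad = jax.vmap(grad_loss, in_axes=(None, 0))   # vectorize over a batch of x
fast_grad = jax.jit(batched_grad)                       # JIT-compile to XLA (CPU/GPU/TPU)

w = jnp.array(2.0)
xs = jnp.arange(4.0)
print(fast_grad(w, xs))  # one gradient per batch element
```
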
[Flax](https://flax.readthedocs.io/en/latest/index.html) is a high-performance neural network library built on top of JAX and designed for flexibility. It aims to give users full control of their training code and is carefully designed to work well with JAX transformations such as `grad` and `pmap` (see the [Flax philosophy](https://flax.readthedocs.io/en/latest/philosophy.html)). For an introduction to Flax, see the [Flax Basics Colab](https://flax.readthedocs.io/en/latest/notebooks/flax_basics.html) or the list of curated [Flax examples](https://flax.readthedocs.io/en/latest/examples.html).
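
Below is a minimal sketch of a Flax `linen` module, assuming `flax` is installed (the module and shapes are illustrative). It shows the point above: model parameters are a plain pytree, so ordinary `jax.grad` differentiates through the model with no special machinery.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

class MLP(nn.Module):
    features: int

    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(self.features)(x))
        return nn.Dense(1)(x)

model = MLP(features=16)
x = jnp.ones((4, 8))                           # dummy batch
params = model.init(jax.random.PRNGKey(0), x)  # initialize parameters

def loss_fn(params, x):
    # Parameters are a plain pytree, so jax.grad works on them directly.
    return jnp.mean(model.apply(params, x) ** 2)

grads = jax.grad(loss_fn)(params, x)
```
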
## Quickstart flax and jax in transformers

TODO (should be filled by 25.06.)...
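
As a hedged illustration of the pattern this section will cover, the sketch below loads a Flax model through the familiar `from_pretrained` API (the `bert-base-cased` checkpoint and the `FlaxBertModel` class are assumptions for illustration, not taken from this diff):

```python
from transformers import BertTokenizerFast, FlaxBertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
model = FlaxBertModel.from_pretrained("bert-base-cased")

# Flax models consume NumPy arrays, so request "np" tensors from the tokenizer.
inputs = tokenizer("Hello, JAX!", return_tensors="np")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```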