docs(wandb): explain how to use W&B integration (#5607)

* docs(wandb): explain how to use W&B integration

fix #5262

* Also mention TensorBoard

Co-authored-by: Julien Chaumond <chaumond@gmail.com>
Boris Dayma 2020-07-14 04:12:33 -05:00 committed by GitHub
parent cd30f98fd2
commit 4d5a8d6557

@ -78,3 +78,32 @@ python examples/xla_spawn.py --num_cores 8 \
```
Feedback and more use cases and benchmarks involving TPUs are welcome; please share with the community.
## Logging & Experiment tracking
You can easily log and monitor your training runs. [TensorBoard](https://www.tensorflow.org/tensorboard) and [Weights & Biases](https://docs.wandb.com/library/integrations/huggingface) are currently supported.
To use Weights & Biases, install the `wandb` package with:
```bash
pip install wandb
```
Then log in on the command line:
```bash
wandb login
```
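In non-interactive environments (CI jobs, remote clusters), the API key can instead be supplied through the `WANDB_API_KEY` environment variable, which `wandb` picks up so no prompt is needed (a sketch; the placeholder key below is hypothetical):
```bash
# Hypothetical placeholder key -- wandb reads WANDB_API_KEY, so no interactive prompt appears.
export WANDB_API_KEY=0123456789abcdef
```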
If you are in Jupyter or Colab, you should log in with:
```python
import wandb
wandb.login()
```
Whenever you use the `Trainer` or `TFTrainer` classes, your losses, evaluation metrics, model topology, and gradients (for `Trainer` only) will automatically be logged.
For advanced configuration and examples, refer to the [W&B documentation](https://docs.wandb.com/library/integrations/huggingface).
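For instance, the run can be configured through environment variables set before the `Trainer` is created (a minimal sketch; the variable names used here, such as `WANDB_PROJECT` and `WANDB_WATCH`, should be verified against the linked documentation):

```python
import os

# Set these before instantiating Trainer; the W&B integration reads them at setup.
# Variable names are assumptions -- verify them in the W&B documentation.
os.environ["WANDB_PROJECT"] = "my-hf-project"  # target project (hypothetical name)
os.environ["WANDB_WATCH"] = "gradients"        # also log gradient histograms (Trainer only)
# os.environ["WANDB_DISABLED"] = "true"        # uncomment to turn off W&B logging entirely
```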
When using 🤗 Transformers with PyTorch Lightning, runs can be tracked through `WandbLogger`. Refer to related [documentation & examples](https://docs.wandb.com/library/frameworks/pytorch/lightning).