Language generation

Based on the script run_generation.py.

Conditional text generation using the auto-regressive models of the library: GPT, GPT-2, GPTJ, Transformer-XL, XLNet, CTRL, BLOOM, LLAMA, OPT. A similar script powers our official demo Write With Transformer, where you can try out the different models available in the library.
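
For context, conditional generation means the model continues a prompt token by token. The following is a minimal sketch of doing this directly with the library's `generate()` API rather than the script; the `gpt2` checkpoint, prompt text, and sampling settings are illustrative choices, not values taken from `run_generation.py`.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative checkpoint; any auto-regressive model from the list above works similarly.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Condition the model on a prompt and sample a continuation.
inputs = tokenizer("Once upon a time", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```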

Example usage:

```bash
python run_generation.py \
    --model_type=gpt2 \
    --model_name_or_path=gpt2
```
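
The script relies on Accelerate's `PartialState` to run on whatever device type Accelerate supports (CUDA GPU, XPU, NPU, MPS, or CPU). The sketch below shows the general pattern in simplified form; it is not lifted from the script, and the `gpt2` checkpoint and prompt are only examples.

```python
from accelerate import PartialState
from transformers import AutoModelForCausalLM, AutoTokenizer

# PartialState resolves the accelerator that is actually available on this machine.
distributed_state = PartialState()

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").to(distributed_state.device)

# Keep the inputs on the same device as the model before generating.
inputs = tokenizer("Hello, my name is", return_tensors="pt").to(distributed_state.device)
output_ids = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```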