transformers/tests/models/gpt_neox
Latest commit 34d9409427 by Joao Gante: Llama/GPTNeoX: add RoPE scaling (#24653)
* add rope_scaling

* tmp commit

* add gptneox

* add tests

* GPTNeoX can now handle long inputs, so the pipeline test was wrong

* Update src/transformers/models/open_llama/configuration_open_llama.py

Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>

* remove ntk

* remove redundant validation

---------

Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
2023-07-13 16:47:30 +01:00
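
The `rope_scaling` option this commit adds lets configs such as `LlamaConfig` and `GPTNeoXConfig` stretch the rotary position embeddings so the model can handle inputs longer than its original context window, e.g. `LlamaConfig(rope_scaling={"type": "linear", "factor": 2.0})`. A minimal pure-Python sketch of the linear variant (the function name and exact signature are illustrative, not the library's internals):

```python
import math


def rotary_angles(position, dim, base=10000.0, scaling_factor=1.0):
    """Rotary-embedding angles for a single position index.

    With linear RoPE scaling, the position is divided by
    `scaling_factor` before the angles are computed, which stretches
    the usable context window by that factor.
    """
    scaled_pos = position / scaling_factor
    # One inverse frequency per pair of hidden dimensions.
    inv_freq = [base ** (-2 * i / dim) for i in range(dim // 2)]
    return [scaled_pos * f for f in inv_freq]


# With factor=2.0, position 2048 reproduces the unscaled angles of
# position 1024, so a model trained on 2048 tokens can be run on ~4096.
assert rotary_angles(2048, 64, scaling_factor=2.0) == rotary_angles(1024, 64)
```

The "dynamic" scaling type added alongside it adjusts the factor based on the current sequence length instead of using a fixed value.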
__init__.py [WIP] Adding GPT-NeoX-20B (#16659) 2022-05-24 09:31:10 -04:00
test_modeling_gpt_neox.py Llama/GPTNeoX: add RoPE scaling (#24653) 2023-07-13 16:47:30 +01:00