Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-29 17:22:25 +06:00)
* add rope_scaling
* tmp commit
* add gptneox
* add tests
* GPTNeoX can now handle long inputs, so the pipeline test was wrong
* Update src/transformers/models/open_llama/configuration_open_llama.py

Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>

* remove ntk
* remove redundant validation

---------

Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
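A minimal sketch of how the `rope_scaling` option added in this commit can be used with `GPTNeoXConfig`. The dictionary keys (`type`, `factor`) follow the convention used for RoPE scaling elsewhere in `transformers`; the concrete values below are illustrative assumptions, not taken from the PR.

```python
# Sketch (assumed usage): enable RoPE scaling on GPT-NeoX so the model can
# handle inputs longer than its original max_position_embeddings.
from transformers import GPTNeoXConfig, GPTNeoXForCausalLM

# rope_scaling is a dict with a scaling strategy ("linear" or "dynamic")
# and a scaling factor; the values here are placeholders for illustration.
config = GPTNeoXConfig(
    max_position_embeddings=2048,
    rope_scaling={"type": "linear", "factor": 2.0},
)

model = GPTNeoXForCausalLM(config)
```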
..
__init__.py
test_modeling_gpt_neox.py