Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-24 23:08:57 +06:00)
parent: eae8ee0389
commit: 33d3072e1c
model_cards/lysandre/arxiv-nlp/README.md (new file, +7)
@@ -0,0 +1,7 @@
# ArXiv-NLP GPT-2 checkpoint
This is a GPT-2 small checkpoint for PyTorch. It is the official `gpt2-small` checkpoint fine-tuned on ArXiv papers from the computational linguistics field.
## Training data
This model was trained on a subset of ArXiv papers that were converted from PDF to plain text. The resulting dataset comprises 80MB of text from the computational linguistics (cs.CL) field.
model_cards/lysandre/arxiv/README.md (new file, +7)
@@ -0,0 +1,7 @@
# ArXiv GPT-2 checkpoint
This is a GPT-2 small checkpoint for PyTorch. It is the official `gpt2-small` checkpoint fine-tuned on ArXiv papers from physics fields.
## Training data
This model was trained on a subset of ArXiv papers that were converted from PDF to plain text. The resulting dataset comprises 130MB of text, mostly from quantum physics (quant-ph) and other physics sub-fields.
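
A minimal usage sketch via the high-level `pipeline` API, assuming the checkpoint is published on the model hub under `lysandre/arxiv` (the id is inferred from this file's path):

```python
# Sketch: sample physics-flavored text from the ArXiv GPT-2 checkpoint.
# The model id "lysandre/arxiv" is an assumption based on the card's path.
from transformers import pipeline

generator = pipeline("text-generation", model="lysandre/arxiv")
result = generator("The Hamiltonian of the system", max_length=40)
print(result[0]["generated_text"])
```

`pipeline("text-generation", ...)` wraps the same `GPT2LMHeadModel`/`GPT2Tokenizer` pair under the hood, so either interface works for this checkpoint.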