---
license: mit
---
## T5 for question-generation
This is a t5-small model trained for the answer-aware question generation task. The answer text is prepended to the context text.

You can play with the model using the inference API. Just provide the input text in this format and see the results:

```
answer: answer_text context: context_text </s>
```

For example:

```
answer: 42 context: 42 is the answer to life, the universe and everything. </s>
```
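
If you prefer to call the model directly with the `transformers` library instead of the inference API, a minimal sketch is shown below. The checkpoint id used here is an assumption for illustration; substitute this model card's actual id.

```python
# A minimal sketch of answer-aware question generation with transformers.
# NOTE: "valhalla/t5-small-qg-prepend" is an assumed checkpoint id for illustration;
# replace it with this model's actual id.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "valhalla/t5-small-qg-prepend"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Prepend the answer to the context, following the expected input format.
text = "answer: 42 context: 42 is the answer to life, the universe and everything. </s>"
inputs = tokenizer(text, return_tensors="pt")

# Generate the question and decode it.
output_ids = model.generate(**inputs, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```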
For more details see this repo.
### Model in action 🚀
You'll need to clone the repo.
```python
from pipelines import pipeline

# Load the question-generation pipeline with the prepended-answer format.
nlp = pipeline("question-generation", qg_format="prepend")

nlp("42 is the answer to life, universe and everything.")
# => [{'answer': '42', 'question': 'What is the answer to life, universe and everything?'}]
```