transformers/model_cards/valhalla/t5-samll-qg-prepend
2020-11-11 12:42:50 +01:00

---
datasets:
- squad
tags:
- question-generation
widget:
- text: "answer: 42 context: 42 is the answer to life, the universe and everything. </s>"
- text: "answer: Guido Van Rossum context: Python is a programming language. It is developed by Guido Van Rossum. </s>"
- text: "answer: Explicit context: Explicit is better than implicit </s>"
license: mit
---

## T5 for question-generation

This is a t5-small model trained for the answer-aware question generation task. The answer text is prepended to the context text.

You can play with the model using the inference API. Just provide the input text in this format and see the results: `answer: answer_text context: context_text </s>`

For example:

```
answer: 42 context: 42 is the answer to life, the universe and everything. </s>
```
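Assembling this input string by hand is error-prone, so a small helper can be useful. This is a minimal sketch; the `prepend_format` name is illustrative and not part of the model or any repo:

```python
def prepend_format(answer: str, context: str) -> str:
    # Build the "answer: ... context: ... </s>" string the model expects.
    return f"answer: {answer} context: {context} </s>"

print(prepend_format("42", "42 is the answer to life, the universe and everything."))
# => answer: 42 context: 42 is the answer to life, the universe and everything. </s>
```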

For more details see this repo.

## Model in action 🚀

You'll need to clone the repo.

Open In Colab

```python3
from pipelines import pipeline

nlp = pipeline("question-generation", qg_format="prepend")
nlp("42 is the answer to life, universe and everything.")
=> [{'answer': '42', 'question': 'What is the answer to life, universe and everything?'}]
```