The option `--do_lower_case` is currently required by the uncased models (e.g., `bert-base-uncased`, `bert-large-uncased`); see the sketch after the results below.
Results:
* BERT-BASE without `--do_lower_case`: exact = 73.83, F1 = 82.22
* BERT-BASE with `--do_lower_case`: exact = 81.02, F1 = 88.34
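The reason the flag matters can be sketched with the tokenizer alone, assuming the standard `BertTokenizer` API: uncased checkpoints were pretrained on lowercased text, so inference must lowercase too.

```python
from transformers import BertTokenizer

# Uncased checkpoints were pretrained on lowercased text; without lowering,
# cased tokens map to vocabulary entries the model barely saw in pretraining.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased", do_lower_case=True)

print(tokenizer.tokenize("Why is BERT case sensitive?"))
# expected output along the lines of: ['why', 'is', 'bert', 'case', 'sensitive', '?']
```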
* make transformers-cli cross-platform
Using "scripts" is a useful option in setup.py particularly when you want to get access to non-python scripts. However, in this case we want to have an entry point into some of our own Python scripts. To do this in a concise, cross-platfom way, we can use entry_points.console_scripts. This change is necessary to provide the CLI on different platforms, which "scripts" does not ensure. Usage remains the same, but the "transformers-cli" script has to be moved (be part of the library) and renamed (underscore + extension)
* make style & quality
* added LongformerForQuestionAnswering
* add LongformerForQuestionAnswering
* fix import for LongformerForMaskedLM
* hardcoded sep_token_id
* compute attention_mask if not provided
* combine global_attention_mask with attention_mask when provided (see the sketch after this entry)
* update example in docstring
* add assert error messages, improve the attention mask combination
* add test for longformerForQuestionAnswering
* typo
* cast global_attention_mask to long
* make style
* Update src/transformers/configuration_longformer.py
* fix the code quality
* Merge branch 'longformer-for-question-answering' of https://github.com/patil-suraj/transformers into longformer-for-question-answering
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
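The attention-mask combination above works roughly as follows; a minimal sketch, assuming Longformer's internal 0/1/2 convention (0 = padding, 1 = local attention, 2 = global attention), with the helper name made up:

```python
import torch

def merge_masks_sketch(attention_mask: torch.Tensor,
                       global_attention_mask: torch.Tensor) -> torch.Tensor:
    # attention_mask: 1 for real tokens, 0 for padding.
    # global_attention_mask: 1 where a token should attend globally
    # (e.g. question tokens in QA, located via sep_token_id), 0 elsewhere.
    # The product maps tokens to 0 = padding, 1 = local, 2 = global, and
    # padding stays 0 even if global attention was requested there.
    return attention_mask * (global_attention_mask + 1)

attn = torch.tensor([[1, 1, 1, 0]])
glob = torch.tensor([[1, 0, 0, 1]]).long()  # cast to long, per the fix above
print(merge_masks_sketch(attn, glob))       # tensor([[2, 1, 1, 0]])
```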
* Add Type Hints to modeling_utils.py (Closes #3911)
Add type hints to methods in `modeling_utils.py`.
Note: coverage isn't 100%; internal methods were mostly skipped. The typing conventions are sketched after this entry.
* Reformat according to `black` and `isort`
* Use typing.Iterable instead of Sequence
* Parameterize Iterable by its generic type
* Use typing.Optional when None is the default value
* Adhere to style guideline
* Update src/transformers/modeling_utils.py
Co-authored-by: Julien Chaumond <chaumond@gmail.com>
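A hypothetical signature illustrating those conventions (the function is made up, not actual `modeling_utils.py` code):

```python
from typing import Iterable, Optional

import torch

def select_rows_sketch(
    weight: torch.Tensor,
    rows: Iterable[int],                    # Iterable rather than Sequence: any iterable of ints works
    device: Optional[torch.device] = None,  # Optional because None is the default value
) -> torch.Tensor:
    # Gather the requested rows, then move them to the target device if given.
    index = torch.tensor(list(rows), dtype=torch.long, device=weight.device)
    out = weight.index_select(0, index)
    return out.to(device) if device is not None else out
```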
* Warn the user that max_len is on the path to deprecation.
* Ensure better backward compatibility when max_len is provided to a tokenizer.
* Make sure to override the parameter and not the actual instance value (sketched below).
* Format & quality
* Add a predict stage for GLUE tasks, and generate result files that can be submitted to the gluebenchmark.com website.
* Use the Split enum + always output the label name (see the sketch below)
Co-authored-by: Julien Chaumond <chaumond@gmail.com>
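For classification tasks, the prediction files can be produced along these lines; a sketch with a hypothetical helper name (`Split` being the train/dev/test enum mentioned above):

```python
import numpy as np

def write_glue_predictions_sketch(logits: np.ndarray, label_list, output_file: str) -> None:
    # Argmax over the logits, then write the label *name* (not its index)
    # in the index/prediction TSV layout that gluebenchmark.com expects.
    preds = np.argmax(logits, axis=1)
    with open(output_file, "w") as writer:
        writer.write("index\tprediction\n")
        for i, pred in enumerate(preds):
            writer.write(f"{i}\t{label_list[pred]}\n")
```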