Custom errors and BatchSizeError (#13184)

* Adding custom errors and BatchSizeError for GPT2

* Adding custom errors and BatchSizeError for GPT2

* Changing Exception to BaseException

* Exception

* Adding args to Custom Exception

* Adding args to Custom Exception

* Changing from BaseException to Exception

* Changing Conditional loop syntax

* Adding Copyright info

* Handling check_code_quality

* Handling check_code_quality pt2

* Handling check_code_quality pt3

* Handling check_code_quality pt4

* Handling check_code_quality pt5

* Handling check_code_quality pt6

* Handling check_code_quality pt6

* Using black for check_code_quality

* sorting import style

* Changing

* Changing

* verified through style_doc.py

* verified through style_doc.py

* applying isort

* Removing indentation

* Changing

* Changing

* Changing

* Used ValueError

* Using ValueError

* Reformatted Style doc

* Using style doc on modeling_gpt2.py

* Adding indentation

* Changing
Ambesh Shekhar 2021-08-24 18:31:01 +05:30 committed by GitHub
parent cf57447648
commit 0512bfe79e


@@ -695,7 +695,8 @@ class GPT2Model(GPT2PreTrainedModel):
         # GPT2Attention mask.
         if attention_mask is not None:
-            assert batch_size > 0, "batch_size has to be defined and > 0"
+            if batch_size <= 0:
+                raise ValueError("batch_size has to be defined and > 0")
             attention_mask = attention_mask.view(batch_size, -1)
             # We create a 3D attention mask from a 2D tensor mask.
             # Sizes are [batch_size, 1, 1, to_seq_length]
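
For context, the net change replaces an assert with an explicit ValueError, so the batch-size check still runs when Python is started with -O (which strips assert statements). A minimal standalone sketch of the same pattern follows; the validate_batch_size helper and the example tensor are illustrative only and not part of the PR:

import torch

def validate_batch_size(batch_size: int) -> None:
    # Mirror the check added in the diff: reject non-positive batch sizes
    # with a ValueError instead of relying on assert.
    if batch_size <= 0:
        raise ValueError("batch_size has to be defined and > 0")

# Usage sketch: input_ids shaped [batch_size, seq_len]
input_ids = torch.tensor([[101, 102, 103]])
batch_size = input_ids.shape[0]
validate_batch_size(batch_size)  # batch_size == 1, so the check passes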