transformers/model_cards/smanjil/German-MedBERT/README.md
Commit: german medbert model details (#8266), smanjil, 2020-11-06 03:21:13 -05:00
Co-authored-by: Julien Chaumond <chaumond@gmail.com>

---
language: de
---

# German Medical BERT

This is a German BERT model fine-tuned on the medical domain, based on bert-base-german-cased.
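Since the base model is a masked language model, a quick way to try it is the Transformers fill-mask pipeline. A minimal sketch, using the model id from this card's hub path; the example sentence is illustrative:

```python
from transformers import pipeline

# Load the fine-tuned model from the Hugging Face hub (downloads weights on first use).
fill = pipeline("fill-mask", model="smanjil/German-MedBERT")

# Predict the masked token in a German medical sentence (illustrative input).
preds = fill("Der Patient leidet an einer [MASK].")
print(preds[0]["token_str"], preds[0]["score"])
```

Each prediction is a dict containing the filled-in token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).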

## Overview

- **Language model:** bert-base-german-cased
- **Language:** German
- **Fine-tuning:** medical articles (diseases, symptoms, therapies, etc.)
- **Eval data:** NTS-ICD-10 dataset (classification)
- **Infrastructure:** Google Colab

## Details

- Fine-tuned with PyTorch and the Hugging Face Transformers library on a Colab GPU.
- Used the standard hyperparameter settings for fine-tuning recommended in the original BERT paper.
- Classification, however, required training for up to 25 epochs.
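For concreteness, a hedged sketch of what these settings look like. The values below follow the fine-tuning grid recommended in the original BERT paper; the card only confirms the base model and the 25-epoch classification run, so the remaining values are assumptions:

```python
# Sketch of fine-tuning hyperparameters. Only base_model and num_train_epochs
# are stated in the card; the rest are taken from the BERT paper's recommended
# ranges and are assumptions, not the authors' exact configuration.
finetune_config = {
    "base_model": "bert-base-german-cased",
    "learning_rate": 2e-5,       # BERT paper grid: {5e-5, 3e-5, 2e-5}
    "batch_size": 32,            # BERT paper grid: {16, 32}
    "max_seq_length": 512,
    "num_train_epochs": 25,      # card: up to 25 epochs for classification
    "optimizer": "AdamW",
}
print(finetune_config)
```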

## Performance

Micro-averaged precision, recall and F1 score for multilabel ICD-code classification.

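Micro averaging pools every (document, label) decision into global true-positive, false-positive, and false-negative counts before computing the metrics. A minimal sketch with toy binary label matrices (not the NTS-ICD-10 data):

```python
def micro_prf(y_true, y_pred):
    """Micro-averaged precision, recall and F1 over binary multilabel matrices."""
    tp = fp = fn = 0
    for row_t, row_p in zip(y_true, y_pred):
        for t, p in zip(row_t, row_p):
            if p and t:
                tp += 1          # label predicted and present
            elif p and not t:
                fp += 1          # label predicted but absent
            elif t and not p:
                fn += 1          # label present but missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Two documents, three ICD codes each (toy example).
y_true = [[1, 0, 1], [0, 1, 0]]
y_pred = [[1, 0, 0], [0, 1, 1]]
print(micro_prf(y_true, y_pred))  # p = r = f1 = 2/3 on this toy example
```

The same numbers can be obtained with `sklearn.metrics.f1_score(y_true, y_pred, average="micro")`.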

## Author

Manjil Shrestha: shresthamanjil21 [at] gmail.com

Get in touch: LinkedIn