
Table 2.

The different pre-trained language models considered in the evaluation, their versions in the HuggingFace repository, and the type of pre-training

| Pre-trained language model version (HuggingFace) | Corpus | Pre-training |
|---|---|---|
| UMLSBert_ENG [25] | PubMed + UMLS | Continual pretraining + weight adjustment |
| biobert-base-cased-v1.1 [26] | PubMed | Continual pretraining |
| bluebert_pubmed_uncased_L-12_H-768_A-12 [27] | PubMed + MIMIC-III notes | Continual pretraining |
| scibert_scivocab_uncased [28] | Semantic Scholar | From scratch |
| Bio_clinicalBERT [29] | MIMIC-III notes | Continual pretraining |
| BERT-base-uncased [30] | Wikipedia | From scratch |
| BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext [31] | PubMed | From scratch |
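
Each checkpoint in the table can be loaded by name from the HuggingFace hub with the `transformers` library. The sketch below is a minimal illustration, not part of the original evaluation pipeline; note that full hub IDs carry organization prefixes (e.g. `allenai/`, `dmis-lab/`) that the table omits, so the mapping shown here is an assumption to be checked against the hub.

```python
# Minimal sketch: loading one of the evaluated checkpoints from the
# HuggingFace hub. The organization prefixes below are assumptions --
# the table lists only the bare model names; verify the exact hub IDs.
from transformers import AutoModel, AutoTokenizer

MODELS = {
    "BERT-base-uncased": "bert-base-uncased",                        # from scratch, Wikipedia
    "scibert_scivocab_uncased": "allenai/scibert_scivocab_uncased",  # assumed org prefix
    "biobert-base-cased-v1.1": "dmis-lab/biobert-base-cased-v1.1",   # assumed org prefix
}

hub_id = MODELS["BERT-base-uncased"]
tokenizer = AutoTokenizer.from_pretrained(hub_id)  # loads the matching vocabulary
model = AutoModel.from_pretrained(hub_id)          # loads the pre-trained encoder weights

# Encode a sample sentence and inspect the contextual embeddings.
inputs = tokenizer("Patient presents with acute dyspnea.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, tokens, hidden_size); 768 for BERT-base
```

Because all of the listed models expose the same `AutoModel`/`AutoTokenizer` interface, swapping the hub ID is enough to compare encoders while keeping the rest of the pipeline unchanged.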