Please note: This PhD seminar will be given online.
Georgios Michalopoulos, PhD candidate
David R. Cheriton School of Computer Science
Supervisors: Professors Ian McKillop and Helen Chen
Contextual word embedding models, such as BioBERT and Bio_ClinicalBERT, have achieved state-of-the-art results in biomedical natural language processing tasks by focusing their pre-training on domain-specific corpora. However, such models do not incorporate the structured domain knowledge available in expert-curated knowledge bases.
We introduce UmlsBERT, a contextual embedding model that integrates domain knowledge during pre-training via a novel knowledge augmentation strategy. More specifically, UmlsBERT is augmented with the Unified Medical Language System (UMLS) Metathesaurus in two ways: (i) connecting words that share the same underlying 'concept' in UMLS, and (ii) leveraging UMLS semantic type knowledge to create clinically meaningful input embeddings. By applying these two strategies, UmlsBERT encodes clinical domain knowledge into its word embeddings and outperforms existing domain-specific models on common named-entity recognition (NER) and clinical natural language inference tasks.
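To give a rough sense of strategy (ii), the sketch below shows one plausible way a semantic-type embedding could be summed into a token's input representation, analogous to BERT's segment embeddings. The vocabularies, dimensions, and semantic-type labels here are purely illustrative assumptions, not the actual UmlsBERT implementation.

```python
import numpy as np

# Illustrative sketch (not the real UmlsBERT code): each token's input
# embedding is the sum of its token embedding and a learned embedding for
# its UMLS semantic type, similar in spirit to BERT's segment embeddings.

rng = np.random.default_rng(0)
hidden_size = 8  # toy dimension; real BERT models use 768+

# Toy vocabularies (hypothetical examples).
vocab = {"[CLS]": 0, "fever": 1, "aspirin": 2}
# Index 0 is reserved for tokens with no UMLS semantic type.
sem_types = {None: 0, "Sign or Symptom": 1, "Pharmacologic Substance": 2}

token_emb = rng.normal(size=(len(vocab), hidden_size))
sem_emb = rng.normal(size=(len(sem_types), hidden_size))

def input_embeddings(tokens, types):
    """Sum token embeddings with per-position semantic-type embeddings."""
    tok_ids = [vocab[t] for t in tokens]
    type_ids = [sem_types[t] for t in types]
    return token_emb[tok_ids] + sem_emb[type_ids]

x = input_embeddings(
    ["[CLS]", "fever", "aspirin"],
    [None, "Sign or Symptom", "Pharmacologic Substance"],
)
print(x.shape)  # (3, 8)
```

In this toy setup, two surface-different tokens that map to related UMLS semantic types receive a shared signal in their input representations, which is one intuition for why such augmentation can help downstream clinical tasks.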
To join this PhD seminar on Zoom, please go to https://us02web.zoom.us/j/83036831105?pwd=V2cyeFFUdmEzN0JHMi95VFdtT3VvZz09.
200 University Avenue West
Waterloo, ON N2L 3G1