BertMCN: Mapping colloquial phrases to standard medical concepts using BERT and Highway Network
EasyChair Preprint 2268, version 1
21 pages • Date: December 29, 2019

Abstract

In the last few years, people have started to share a lot of health-related information in the form of tweets, reviews, and blog posts. All of this user-generated clinical text can be mined to generate useful insights. However, automatic analysis of clinical text requires the identification of standard medical concepts. Most existing deep learning systems for normalizing medical concepts are based on CNNs or RNNs, and their performance is limited because they must be trained from scratch (except for the embeddings). In this work, we propose a normalization system based on pre-trained BERT and a highway layer. BERT, a pre-trained context-sensitive language representation model, has advanced the state-of-the-art performance on many NLP tasks, and the gating mechanism in the highway layer helps the model select only the important information. Experimental results show that our model outperforms all existing methods on two standard datasets.

Keyphrases: BERT, Highway layer, Natural Language Processing, Social Media Text, medical concept normalization
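The abstract describes a highway layer whose sigmoid gate mixes a transformed representation with the raw input, letting the model keep only the important information. A minimal NumPy sketch of that gating mechanism is below; the class name, tanh activation, and initialization scheme are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class HighwayLayer:
    """One highway layer: y = t * h + (1 - t) * x, where the transform
    gate t (a sigmoid) decides, per dimension, how much of the
    transformed representation h to keep versus the untouched input x."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W_h = rng.normal(0.0, 0.02, (dim, dim))  # transform weights
        self.b_h = np.zeros(dim)
        self.W_t = rng.normal(0.0, 0.02, (dim, dim))  # gate weights
        # Negative gate bias so the layer initially passes input through,
        # a common highway-network initialization.
        self.b_t = np.full(dim, -2.0)

    def __call__(self, x):
        h = np.tanh(x @ self.W_h + self.b_h)   # candidate transform (assumed tanh)
        t = sigmoid(x @ self.W_t + self.b_t)   # transform gate in (0, 1)
        return t * h + (1.0 - t) * x           # gated mix of h and x

# Example: gate a small batch of 768-dim "BERT-like" sentence vectors.
layer = HighwayLayer(dim=768)
x = np.random.default_rng(1).normal(size=(4, 768))
y = layer(x)
print(y.shape)  # (4, 768)
```

Because the gate output lies strictly between 0 and 1, each output dimension is a convex combination of the transformed and original features, which is what allows the layer to suppress uninformative parts of the BERT representation.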