
BertMCN: Mapping colloquial phrases to standard medical concepts using BERT and Highway Network

EasyChair Preprint 2268, version 2

21 pages
Date: January 4, 2020

Abstract

In the last few years, people have started to share a lot of health-related information in the form of tweets, reviews, and blog posts. All of this user-generated clinical text can be mined to generate useful insights. However, automatic analysis of clinical text requires the identification of standard medical concepts. Most existing deep learning based medical concept normalization systems are built on CNNs or RNNs, and their performance is limited because they must be trained from scratch (except for the embeddings). In this work, we propose a normalization system based on pre-trained BERT and a highway layer. BERT, a pre-trained context-sensitive language representation model, has advanced the state-of-the-art performance in many NLP tasks, and the gating mechanism in the highway layer helps the model retain only the important information. Experimental results show that our model outperforms all existing methods on two standard datasets.
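To make the described architecture concrete, here is a minimal sketch (not the authors' released code) of the idea in the abstract: encode a colloquial phrase with pre-trained BERT, pass the [CLS] representation through a highway layer whose sigmoid gate mixes the transformed and original features, and classify over a fixed set of medical concept IDs. The model name, label count, and example phrase are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class Highway(nn.Module):
    """Highway layer: y = g * H(x) + (1 - g) * x, with sigmoid gate g."""
    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)

    def forward(self, x):
        g = torch.sigmoid(self.gate(x))      # gate chooses what to keep
        h = torch.relu(self.transform(x))    # transformed representation
        return g * h + (1.0 - g) * x         # gated mix of new and old

class BertMCN(nn.Module):
    def __init__(self, num_concepts, bert_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        dim = self.bert.config.hidden_size
        self.highway = Highway(dim)
        self.classifier = nn.Linear(dim, num_concepts)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]          # [CLS] phrase vector
        return self.classifier(self.highway(cls))  # logits over concepts

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertMCN(num_concepts=2200)  # hypothetical, dataset-dependent count
enc = tokenizer("head spinning a little", return_tensors="pt")
logits = model(enc["input_ids"], enc["attention_mask"])
```

In this framing, normalization is treated as classification over concept identifiers, which matches the abstract's claim that only the small highway-plus-classifier head is trained from scratch while BERT contributes pre-trained contextual representations.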

Keyphrases: BERT, highway layer, natural language processing, social media text, medical concept normalization

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:2268,
  author       = {Katikapalli Subramanyam Kalyan and S. Sangeetha},
  title        = {BertMCN: Mapping colloquial phrases to standard medical concepts using BERT and Highway Network},
  howpublished = {EasyChair Preprint 2268},
  year         = {EasyChair, 2020}}