1DCNN-TRSNet: A Hybrid End-to-End Arrhythmia Classification Deep Network Based on Transformer

EasyChair Preprint no. 10865 · 7 pages · Date: September 8, 2023

Abstract

Intelligent recognition of arrhythmias from ECG signals is crucial for the diagnosis and prevention of heart disease. Traditional arrhythmia detection, however, relies on manual analysis by specialized physicians, which can be subjective and time-consuming. In this paper, we propose a novel end-to-end 1DCNN-Transformer hybrid deep learning model for the automatic recognition of arrhythmia types. The proposed model uses a Transformer-based architecture to aggregate the local contextual spatial features extracted by a 1DCNN, and injects temporal information into the high-level abstract features of the convolutional output through positional encoding. The model comprises a Transformer encoder and a Transformer decoder that use a multi-head self-attention mechanism to couple spatiotemporal features from different time segments and to filter useful feature information. Each module transforms its input features into a higher-level abstract output, enabling the model to learn a complex abstract transformation function directly from the raw ECG signal. Finally, the projected output is mapped onto the arrhythmia label space. Experimental results on the MIT-BIH Arrhythmia Database show that the proposed 1DCNN-Transformer network achieves excellent performance, with an overall average recognition accuracy of 99.46% across five categories of arrhythmia signals: normal beat (N), right bundle branch block beat (R), left bundle branch block beat (L), premature ventricular contraction (V), and atrial premature beat (A).

Keyphrases: Arrhythmia, Classification, Transformer
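To make the described pipeline concrete, the following is a minimal sketch, assuming PyTorch, of a 1DCNN front end feeding a Transformer encoder-decoder classifier over five beat classes. The layer widths, kernel sizes, number of attention heads, the use of a single learned decoder query token, and the 360-sample beat length are illustrative assumptions, not the configuration reported in the paper.

```python
# Hypothetical sketch of a 1DCNN + Transformer encoder-decoder beat classifier.
# All hyperparameters below are assumptions for illustration, not the authors' settings.
import math
import torch
import torch.nn as nn


class PositionalEncoding(nn.Module):
    """Standard sinusoidal positional encoding added to the CNN feature sequence."""

    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        pos = torch.arange(max_len).unsqueeze(1).float()
        div = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe.unsqueeze(0))  # shape (1, max_len, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.pe[:, : x.size(1)]


class CNNTransformerClassifier(nn.Module):
    """1D CNN local features -> positional encoding -> Transformer encoder-decoder -> 5-class head."""

    def __init__(self, d_model: int = 128, n_heads: int = 8, n_layers: int = 4, n_classes: int = 5):
        super().__init__()
        # Local feature extractor over the raw single-lead ECG beat (batch, 1, length).
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.BatchNorm1d(32), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.BatchNorm1d(64), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(64, d_model, kernel_size=3, padding=1), nn.BatchNorm1d(d_model), nn.ReLU(),
        )
        self.pos_enc = PositionalEncoding(d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=n_heads,
            num_encoder_layers=n_layers, num_decoder_layers=n_layers,
            dim_feedforward=4 * d_model, batch_first=True,
        )
        # A single learned query token serves as the decoder input (an assumption of this sketch).
        self.query = nn.Parameter(torch.randn(1, 1, d_model))
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, signal_length), e.g. one heartbeat segment.
        feats = self.cnn(x).transpose(1, 2)           # (batch, seq_len, d_model)
        feats = self.pos_enc(feats)                   # add temporal position information
        query = self.query.expand(x.size(0), -1, -1)  # (batch, 1, d_model)
        out = self.transformer(src=feats, tgt=query)  # (batch, 1, d_model)
        return self.head(out.squeeze(1))              # (batch, n_classes) logits over N, R, L, V, A


if __name__ == "__main__":
    model = CNNTransformerClassifier()
    beats = torch.randn(8, 1, 360)  # batch of 8 one-second beats at 360 Hz (MIT-BIH sampling rate)
    print(model(beats).shape)       # torch.Size([8, 5])
```

In this sketch the decoder attends over the positionally encoded CNN feature sequence through cross-attention, and the single output token is projected onto the five-class arrhythmia label space; training would typically minimize cross-entropy over annotated MIT-BIH beats.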