RA-BLS: a Sequential BLSs Integrated with Residual Attention Mechanism

EasyChair Preprint 15340, version 1 · 8 pages · Date: October 31, 2024

Abstract: The Broad Learning System (BLS) has attracted the attention of many researchers because of its excellent performance. However, the number of random nodes becomes very large when the BLS handles large, complex datasets. To address this problem, a novel residual attention mechanism is designed and introduced into the BLS, forming a sequential BLS integrated with a residual attention mechanism (RA-BLS). The goal is to shrink the network structure by correcting the random enhancement nodes. RA-BLS feeds the residuals back to the enhancement nodes to obtain efficient, compact feature representations, further strengthening the approximation capability. As a result, RA-BLS drastically reduces the number of nodes required and further improves the performance of the BLS. Experimental results show that RA-BLS achieves excellent performance on the UCI, MNIST, FASHION-MNIST, and NORB datasets. Notably, an accuracy of 92.33% is achieved on the NORB dataset, an improvement of 2.99% with only 17% of the model size. The source code is available at https://github.com/arolme/RABLS-Residual-Attention-Broad-Learning-System.

Keyphrases: Attention Mechanism, Broad Learning System, Classification, machine learning, residual
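
To make the abstract's description concrete, below is a minimal sketch of a standard BLS forward pass (random feature and enhancement nodes, ridge-regression output weights) with a residual-based correction of the enhancement nodes. The attention-style re-weighting step is a hypothetical illustration of the idea of "feeding residuals back to the enhancement nodes"; the exact RA-BLS update is defined in the paper and released code, not here. All names (ridge_pinv, We, Wh, attn) are illustrative.

import numpy as np

def ridge_pinv(A, lam=1e-3):
    # Ridge-regularized pseudo-inverse used to solve for BLS output weights.
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))            # inputs
Y = rng.standard_normal((200, 5))             # targets (e.g. one-hot labels)

# Feature nodes: random linear maps of the input with a nonlinearity.
We = rng.standard_normal((30, 40))
Z = np.tanh(X @ We)

# Enhancement nodes: random nonlinear expansion of the feature nodes.
Wh = rng.standard_normal((40, 60))
H = np.tanh(Z @ Wh)

A = np.hstack([Z, H])
W = ridge_pinv(A) @ Y                         # output weights
residual = Y - A @ W                          # training residual

# Hypothetical residual-attention correction: re-weight each enhancement node
# by how strongly it aligns with the residual, then re-solve the output layer.
scores = np.abs(H.T @ residual).sum(axis=1)   # per-node relevance to residual
attn = scores / (scores.sum() + 1e-12)        # normalized attention weights
H_corr = H * attn                             # corrected enhancement nodes

A_corr = np.hstack([Z, H_corr])
W_corr = ridge_pinv(A_corr) @ Y
print("residual norm before/after:",
      np.linalg.norm(residual), np.linalg.norm(Y - A_corr @ W_corr))

The intended effect, as claimed in the abstract, is that corrected enhancement nodes capture the target more compactly, so far fewer random nodes are needed for the same approximation quality.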