
Self-Anomaly-Detection Model Training via Initialized Meta Model

EasyChair Preprint 8164

6 pages · Date: May 31, 2022

Abstract

Anomaly detection has become a key challenge affecting the training accuracy of machine learning. Because training data is usually collected from the Internet, many noisy samples are captured, and these samples can decrease model training accuracy. However, abnormal samples are difficult to predict at collection time, the collected training data may contain many unknown exception categories, and the labels of normal samples may be incorrect; under these conditions it is difficult to train a supervised anomaly detection model that accurately identifies anomalous samples. In this paper, we propose a new unsupervised anomaly detection method based on BiGAN, namely Rt-BiGAN, to identify the outliers in the training data. First, we propose a BiGAN network initialization method based on a meta-learning algorithm that uses a small number of normal samples. Then, a self-supervised drop-training scheme is designed to improve the detection ability of the model. Finally, we evaluate Rt-BiGAN on real-world datasets, and the simulation results demonstrate that our mechanism is effective at detecting outliers in model training data.
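For readers unfamiliar with BiGAN-based anomaly detection, the sketch below illustrates the general idea of scoring a sample by how well the encoder-generator pair can reproduce it, combined with a discriminator feature-matching term. This is a minimal, generic illustration only; the exact Rt-BiGAN architecture, the meta-learned initialization, the self-supervised drop training, and the weighting parameter lam are not specified on this page and are assumptions here.

# Minimal sketch of a BiGAN-style anomaly score (not the paper's exact Rt-BiGAN;
# network sizes, the weight `lam`, and the scoring rule are illustrative assumptions).
import torch
import torch.nn as nn

LATENT_DIM = 32
INPUT_DIM = 128  # assumed flat feature vector per training sample

encoder = nn.Sequential(nn.Linear(INPUT_DIM, 64), nn.ReLU(), nn.Linear(64, LATENT_DIM))
generator = nn.Sequential(nn.Linear(LATENT_DIM, 64), nn.ReLU(), nn.Linear(64, INPUT_DIM))
# Discriminator feature extractor over the joint (x, z) pair, as in a BiGAN.
disc_features = nn.Sequential(nn.Linear(INPUT_DIM + LATENT_DIM, 64), nn.ReLU())

def anomaly_score(x: torch.Tensor, lam: float = 0.1) -> torch.Tensor:
    """Higher score = more anomalous: reconstruction error plus a
    discriminator feature-matching term, a common BiGAN-based scoring rule."""
    z = encoder(x)
    x_rec = generator(z)
    rec_err = (x - x_rec).abs().mean(dim=1)
    f_real = disc_features(torch.cat([x, z], dim=1))
    f_rec = disc_features(torch.cat([x_rec, z], dim=1))
    feat_err = (f_real - f_rec).abs().mean(dim=1)
    return (1 - lam) * rec_err + lam * feat_err

# Usage: flag samples whose score exceeds a threshold estimated on normal data.
scores = anomaly_score(torch.randn(16, INPUT_DIM))
outliers = scores > scores.mean() + 2 * scores.std()

In such schemes, samples with high scores are treated as outliers and removed or down-weighted before the main model is trained; the threshold is typically calibrated on held-out normal samples.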

Keyphrases: anomaly detection, meta-learning, self-training

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:8164,
  author    = {Xindi Ma and Cunzhu Su and Jianfeng Ma and Qi Jiang and Ning Xi and Sheng Gao and Kang Xie},
  title     = {Self-Anomaly-Detection Model Training via Initialized Meta Model},
  howpublished = {EasyChair Preprint 8164},
  year      = {EasyChair, 2022}}