ADAGES: Adaptive Aggregation with Stability for Distributed Feature Selection

EasyChair Preprint 3999, 24 pages
Date: August 2, 2020

Abstract

In the era of big data, not only does the sheer volume of data motivate distributed computing, but concerns about data privacy also place growing emphasis on distributed learning. Performing feature selection and controlling the false discovery rate (FDR) in a distributed setting, across multiple machines or institutions, requires an efficient aggregation method. In this paper, we propose an adaptive aggregation method called ADAGES, which can be flexibly combined with any machine-wise feature selection method. We show that our method controls the overall FDR with a theoretical guarantee while maintaining power comparable to the Union aggregation rule in practice.

Keyphrases: aggregation method, controlled feature selection, distributed learning, false discovery rate, stability
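To make the aggregation setting concrete, the sketch below shows threshold-based aggregation of machine-wise selections: each machine reports a set of selected feature indices, and a feature survives if at least c machines selected it, so the Union rule is the special case c = 1. The adaptive choice of c here (the smallest threshold that shrinks the union by at least half) is an illustrative heuristic only, not the paper's actual ADAGES rule.

```python
# Hypothetical sketch of threshold aggregation for distributed feature
# selection. Names and the adaptive heuristic are illustrative assumptions;
# the abstract does not specify the algorithm's details.
from collections import Counter

def aggregate(selections, c):
    """Keep features chosen by at least c of the machine-wise selections.

    selections: list of sets of feature indices, one set per machine.
    c = 1 recovers the Union aggregation rule.
    """
    counts = Counter(f for s in selections for f in s)
    return {f for f, n in counts.items() if n >= c}

def adaptive_aggregate(selections):
    """Scan thresholds c = 1..m and return the first one whose selected
    set is at most half the size of the union -- a stand-in for an
    adaptive, stability-oriented choice of threshold."""
    m = len(selections)
    union_size = len(aggregate(selections, 1))
    for c in range(1, m + 1):
        kept = aggregate(selections, c)
        if len(kept) <= union_size / 2 or c == m:
            return c, kept
```

For example, with three machines reporting {1, 2, 3}, {2, 3, 4}, and {3, 4, 5}, the Union rule keeps all five features, while larger thresholds keep only features that are stable across machines.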