Paper Reading (1): Medical Data Analysis

This post summarizes the following paper:

《Exact Top-k Feature Selection via ℓ2,0-Norm Constraint》, Xiao Cai, Feiping Nie, Heng Huang

The paper builds on sparse methods for machine learning. Supplementary material can be found at this link: https://www.di.ens.fr/~fbach/Cours_peyresq_2010.pdf


Feature selection primarily addresses the problem of finding the most relevant and informative set of features.

Generally speaking, feature selection algorithms can be roughly categorized into three main families: filter, wrapper, and embedded methods.

(1) In filter methods, features are pre-selected according to intrinsic properties of the data, without running the learning algorithm.

(2) In wrapper methods, feature selection is wrapped around the learning algorithm that will ultimately be employed, taking advantage of the "feedback" from that algorithm.

(3) In embedded methods, the feature search and the learning algorithm are incorporated into a single optimization problem.
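To make the first family concrete, here is a minimal sketch of a filter method: features are scored by an intrinsic property of the data (variance, in this toy example) and the top k are kept, with no learner involved. The function name and the scoring criterion are illustrative choices, not something from the paper.

```python
import numpy as np

def variance_filter(X, k):
    """Toy filter method: rank features by variance and keep the top k.

    No learning algorithm is run; the score depends only on the data itself.
    """
    scores = X.var(axis=0)               # one score per feature (column)
    return np.argsort(scores)[::-1][:k]  # indices of the k highest-variance features

X = np.array([[1.0, 0.0, 5.0],
              [2.0, 0.1, 1.0],
              [3.0, 0.0, 9.0]])
print(variance_filter(X, 2))  # selects columns 2 and 0
```

A wrapper method would instead retrain the downstream learner on candidate feature subsets and score each subset by its validation performance.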


Advantages of the method in this paper:

(1) A feature selection method based on a convex problem is NOT always better than its counterpart based on a non-convex problem.

(2) We tackle the original sparse problem with the ℓ2,0-norm constraint directly, instead of a relaxation or approximation of it; therefore, we can obtain a more accurate solution.
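Based on the loss and constraint named in the paper's title and in point (4) below, the optimization problem presumably has the following form (this is a reconstruction from those descriptions, not a formula quoted from the paper), where W is the projection matrix, k is the desired number of features, and the ℓ2,0-norm counts the nonzero rows of W:

```latex
\min_{W} \; \| X^{\top} W - Y \|_{2,1}
\quad \text{s.t.} \quad \| W \|_{2,0} = k
```

Selecting exactly the top-k features corresponds to forcing exactly k rows of W to be nonzero, rather than relaxing the constraint into an ℓ2,1 regularizer.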

(3) We avoid the computational burden of tuning the regularization parameter.

(4) We are the first to provide an efficient algorithm for minimizing an ℓ2,1-norm loss under the ℓ2,0-norm constraint.
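The two norms above are easy to state in code, which may help fix the intuition: for a matrix W, the ℓ2,0-norm counts the rows with nonzero ℓ2 norm (row sparsity, i.e. how many features are selected), while the ℓ2,1-norm sums the ℓ2 norms of the rows. A small self-contained sketch (function names are mine):

```python
import numpy as np

def l20_norm(W):
    """Number of rows of W with nonzero l2 norm (row sparsity)."""
    row_norms = np.linalg.norm(W, axis=1)
    return int(np.count_nonzero(row_norms))

def l21_norm(W):
    """Sum of the l2 norms of the rows of W."""
    return float(np.linalg.norm(W, axis=1).sum())

# Each row of W corresponds to one feature; a zero row means
# that feature is discarded.
W = np.array([[3.0, 4.0],   # row l2 norm = 5.0
              [0.0, 0.0],   # row l2 norm = 0.0 (feature not selected)
              [0.0, 1.0]])  # row l2 norm = 1.0
print(l20_norm(W))  # 2
print(l21_norm(W))  # 6.0
```

The constraint "||W||_{2,0} = k" therefore selects exactly k features, whereas penalizing ||W||_{2,1} only encourages (but does not guarantee) row sparsity, which is why the relaxed problem needs a tuned regularization parameter.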

