Instance Weighted Class Dependent Relief
    Abstract:

    The Relief algorithm is a filter feature selection algorithm that greedily maximizes instance margins under the nearest-neighbor classifier. Combining it with a local weighting method, earlier work proposed the Class Dependent RELIEF (CDRELIEF) algorithm, which trains one feature weight vector for each class and thus better reflects the relevance of features to that class. However, each class-dependent weight vector is only effective for measuring the relevance of features to one particular class, and when these vectors are used directly for classification, the accuracy is not high enough. To make the CDRELIEF algorithm applicable to the classification process, this study modifies the weight-update procedure by assigning an instance weight to each instance in the training set. Incorporating the instance weights into the weight-update formula excludes the influence of outliers and of data points far from the classification boundary on the update, thereby improving classification accuracy. The proposed Instance Weighted CDRELIEF (IWCDRELIEF) algorithm is compared with CDRELIEF on multiple UCI two-class datasets. Experimental results show that IWCDRELIEF significantly improves on CDRELIEF.
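    The instance-weighted update described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name, the L1 distance, the single nearest hit/miss per instance, and the externally supplied instance weights are all assumptions; uniform instance weights recover a plain Relief-style update.

```python
import numpy as np

def instance_weighted_relief(X, y, instance_weights=None):
    """Relief-style feature weighting with per-instance weights (sketch).

    Each instance's contribution to the feature-weight update is scaled
    by its instance weight, so outliers and points far from the class
    boundary (given small weights) barely affect the result. How the
    instance weights themselves are computed is left to the caller.
    """
    n, d = X.shape
    if instance_weights is None:
        instance_weights = np.ones(n)         # uniform -> plain Relief
    w = np.zeros(d)
    for i in range(n):
        dists = np.abs(X - X[i]).sum(axis=1)  # L1 distances to all points
        dists[i] = np.inf                     # exclude the instance itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, dists, np.inf))    # nearest same-class
        miss = np.argmin(np.where(~same, dists, np.inf))  # nearest other-class
        # margin-style update, scaled by the instance weight
        w += instance_weights[i] * (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit]))
    return w / n
```

    On a toy two-class dataset where only the first feature separates the classes, the returned weight for that feature comes out larger than for the noise feature, which is the behavior the margin-maximization argument predicts.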

Citation

Qiu HF, He ZF. Instance weighted class dependent Relief. 计算机系统应用 (Computer Systems & Applications), 2019, 28(7): 121-126.

Article Metrics
  • Abstract: 1623
  • PDF: 2571
  • HTML: 1368
  • Cited by: 0
History
  • Received: January 19, 2019
  • Revised: February 19, 2019
  • Online: July 05, 2019
  • Published: July 15, 2019
Copyright: Institute of Software, Chinese Academy of Sciences Beijing ICP No. 05046678-3