Received: October 21, 2011    Revised: November 20, 2011
Abstract (Chinese): Dimension reduction plays a crucial role in machine learning. Dimension reduction methods fall into two main classes: feature selection and feature extraction. The scatter degree method is a commonly used feature selection technique: a scatter degree is computed for each feature, and features with higher scatter degree values are selected. However, this computation does not consider the mutual influence among features. By improving the computation of the scatter degree, we account not only for the effect of the same feature across classes, but also for the scatter influence between different features. Experiments on UCI datasets show that feature selection with the improved scatter degree achieves good performance.
Abstract: Dimension reduction is important in machine learning. The two main approaches to dimension reduction are feature extraction and feature selection. Scatter degree is a feature selection method that assigns a scatter degree to each feature; features with higher scatter degrees are selected. In this paper, classification error is reduced by taking additional aspects into account when computing the scatter degree. Experiments on UCI datasets show that the improved scatter degree performs well for feature selection.
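The abstract does not give the paper's improved formula. As a point of reference only, below is a minimal Python sketch of a conventional scatter-degree selector, assuming a Fisher-score-style ratio of between-class to within-class scatter per feature; the function names scatter_degree and select_features are illustrative, not from the paper.

    # Sketch of scatter-degree-style feature selection (assumed formulation,
    # not the improved method proposed in the paper).
    import numpy as np

    def scatter_degree(X, y):
        """Per-feature scatter score: between-class over within-class scatter."""
        X = np.asarray(X, dtype=float)
        y = np.asarray(y)
        overall_mean = X.mean(axis=0)
        between = np.zeros(X.shape[1])
        within = np.zeros(X.shape[1])
        for c in np.unique(y):
            Xc = X[y == c]
            n_c = Xc.shape[0]
            between += n_c * (Xc.mean(axis=0) - overall_mean) ** 2
            within += n_c * Xc.var(axis=0)
        return between / (within + 1e-12)  # guard against zero within-class scatter

    def select_features(X, y, k):
        """Indices of the k features with the highest scatter degree."""
        scores = scatter_degree(X, y)
        return np.argsort(scores)[::-1][:k]

Features scoring highest under such a criterion are kept; the paper's contribution is to modify the score so that interactions between different features also influence the ranking.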
Keywords: feature selection; machine learning; scatter degree; pattern classification; feature extraction
Funding: National Natural Science Foundation of China (61170193)
Citation:
LAN Yuan-Dong, DENG Hui-Fang. Feature Selection Method Based on Improved Scatter Degree. COMPUTER SYSTEMS APPLICATIONS, 2012, 21(7): 215-218