Dimensionality Reduction With Extreme Learning Machine Based on Sparsity and Neighborhood Preserving

CHEN Xiao-Yun, LIAO Meng-Zhen

Citation: CHEN Xiao-Yun, LIAO Meng-Zhen. Dimensionality Reduction With Extreme Learning Machine Based on Sparsity and Neighborhood Preserving. ACTA AUTOMATICA SINICA, 2019, 45(2): 325-333. doi: 10.16383/j.aas.2018.c170216

doi: 10.16383/j.aas.2018.c170216
Funds:

National Natural Science Foundation of China 11571074

National Natural Science Foundation of China 71273053

More Information
    Author Bio:

    LIAO Meng-Zhen   Master student at the College of Mathematics and Computer Science, Fuzhou University. Her research interest covers data mining and pattern recognition. E-mail: liao_mengzhen@163.com

    Corresponding author:

    CHEN Xiao-Yun   Professor at the College of Mathematics and Computer Science, Fuzhou University. Her research interest covers data mining and pattern recognition. Corresponding author of this paper. E-mail: c_xiaoyun@fzu.edu.cn

  • Abstract: Neighborhood preserving and sparsity preserving projections have been widely used for dimensionality reduction: an optimization problem yields a projection matrix that preserves either the neighborhood structure or the sparse structure of the data. However, most such methods consider only a single type of structure. Moreover, most nonlinear dimensionality reduction methods cannot produce an explicit mapping function, which greatly limits their applicability. To overcome these problems, this paper draws on the idea of the extreme learning machine (ELM) and proposes a clustering-oriented dimensionality reduction algorithm based on sparsity and neighborhood preserving (SNP-ELM). SNP-ELM is a nonlinear unsupervised method that considers both the sparse structure and the neighborhood structure of the data during dimensionality reduction. Experiments on a toy dataset, the Wine dataset, and six gene expression datasets show that the proposed algorithm outperforms other dimensionality reduction methods.
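The explicit mapping the abstract refers to comes from the ELM construction: a hidden layer with randomly drawn, untrained weights gives a closed-form nonlinear feature map, and only the output weights are solved for under a structure-preserving objective. The sketch below is a minimal Python/NumPy illustration, assuming a sigmoid activation and a US-ELM-style generalized eigenproblem; the function names, the `reg` regularizer, and the single combined Laplacian `L` are illustrative assumptions (the paper's SNP-ELM builds its objective from both a sparse graph and a neighborhood graph), not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh

def elm_feature_map(X, n_hidden=200, seed=0):
    """ELM-style explicit nonlinear mapping: a randomly weighted hidden
    layer with a sigmoid activation. The hidden weights are never
    trained, which is what makes the mapping explicit and cheap to
    apply to new samples."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # hidden biases
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))                # H = g(XW + b)

def embed(H, L, dim=2, reg=1e-3):
    """Solve for output weights B mapping H to a low-dimensional
    embedding that preserves the structure graph encoded by the
    Laplacian L, via the generalized eigenproblem
    (H'LH + reg*I) b = lambda (H'H + reg*I) b, keeping the
    eigenvectors with the smallest eigenvalues."""
    A = H.T @ L @ H + reg * np.eye(H.shape[1])
    B = H.T @ H + reg * np.eye(H.shape[1])
    vals, vecs = eigh(A, B)          # ascending generalized eigenvalues
    V = vecs[:, 1:dim + 1]           # skip the trivial smallest eigenvector
    return H @ V                     # embedded data, n x dim
```

Because the hidden weights are fixed at random, new samples are embedded by applying `elm_feature_map` and the learned output weights directly, with no out-of-sample extension needed.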
    1)  Recommended by Associate Editor ZENG Zhi-Gang
  • Fig.  1  ELM network structure

    Fig.  2  The toy dataset

    Fig.  3  The 1D visualization results of the toy dataset

    Fig.  4  The 2D visualization results of Wine

    Fig.  5  Clustering accuracy on six gene expression datasets mapped into feature spaces of different dimensions

    Fig.  6  Variation of accuracy with respect to parameter $\lambda$ $(\delta=\eta=-0.2)$

    Fig.  7  Variation of accuracy with respect to parameters $\delta$ and $\eta$ $(\lambda=0.001)$

    Table  1  Summary of gene expression data sets

    Dataset      Samples    Genes (dimension)    Classes
    SRBCT        83         2 308                4
    DLBCL        77         5 469                2
    Prostate0    102        6 033                2
    Prostate     102        10 509               2
    Leukemia2    72         11 225               3
    Colon        62         2 000                2

    Table  2  Clustering accuracy comparison (variance) on gene expression data sets (%)

    Each dataset occupies three rows: the accuracy of each method; the (variance, reduced dimension) pair; and the best parameter settings, $(\lambda)$ for US-ELM and $(\lambda, \eta, \delta)$ for SNP-ELM.

    Data  $k$-means  PCA  LPP  NPE  SPP  LLE  US-ELM $(\lambda)$  SNP-ELM $(\lambda, \eta, \delta)$
    Leukemia2 63.89 63.89 70.72 63.89 59.72 65.83 64.44 87.17
    (0.00) (0.00, 2) (3.20, 4) (0, 32) (0.00, 72) (6.65, 4) (1.34, 2) (3.56, 8)
    (0.0001) (0.0001, $-$1, $-$1)
    SRBCT 43.61 48.86 64.19 48.43 38.55 49.76 64.55 82.92
    (6.27) (2.09, 83) (2.21, 83) (0.76, 8) (0.00, 2) (4.33, 8) (10.29, 8) (6.03, 8)
    (0.1) (0.001, $-$0.4, 0)
    DLBCL 68.83 68.83 63.55 69.09 74.02 72.23 76.62 86.34
    (0.00) (0.00, 2) (1.86, 8) (0.82, 32) (0.00, 4) (0.00, 2) (0.00, 32) (1.78, 8)
    (0.0001) (0.001, 0.2, $-$0.6)
    Prostate0 56.86 56.83 56.86 56.86 59.80 56.96 64.09 82.92
    (0.00) (0.00, 2) (0.00, 2) (0.00, 4) (0.00, 102) (0.93, 4) (5.83, 2) (2.19, 102)
    (0.01) (0.1, 0.2, 0.8)
    Prostate 63.33 63.73 59.80 59.80 56.86 59.51 67.57 82.73
    (0.83) (0.00, 2) (0.00, 2) (0.00, 4) (0.00, 102) (0.93, 4) (5.83, 2) (2.19, 102)
    (0.0001) (1, $-$1, 0.6)
    Colon 54.84 54.84 54.84 56.45 64.19 59.52 67.06 85.95
    (0.00) (0.00, 2) (0.00, 2) (0.00, 2) (0.68, 62) (6.99, 32) (4.19, 32) (3.69, 8)
    (0.0001) (0.001, $-$0.8, 1)
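The clustering accuracy in Table 2 compares unsupervised label assignments against ground-truth classes, which requires matching cluster labels to classes. The paper does not spell out its implementation, so the sketch below shows one standard way to compute the metric, using SciPy's Hungarian solver `linear_sum_assignment` (an assumed choice, not the authors' code):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def clustering_accuracy(y_true, y_pred):
    """Clustering accuracy: find the one-to-one mapping between
    predicted cluster labels and ground-truth classes that maximizes
    the number of correctly assigned samples (Hungarian algorithm),
    then report the matched fraction."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    classes = np.unique(y_true)
    clusters = np.unique(y_pred)
    # cont[i, j] = number of samples in cluster i with true class j
    cont = np.array([[np.sum((y_pred == c) & (y_true == k))
                      for k in classes] for c in clusters])
    row, col = linear_sum_assignment(-cont)  # negate to maximize matches
    return cont[row, col].sum() / y_true.size
```

For example, predictions `[1, 1, 0, 0]` against truth `[0, 0, 1, 1]` score 1.0, since the metric is invariant to how clusters are numbered.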
  • [1] Jolliffe I T. Principal Component Analysis. Berlin: Springer-Verlag, 2002.
    [2] He X F, Niyogi P. Locality preserving projections. In: Proceedings of 2003 Neural Information Processing Systems. Vancouver, Canada: NIPS, 2004. 153-160
    [3] He X F, Cai D, Yan S C, Zhang H J. Neighborhood preserving embedding. In: Proceedings of the 10th IEEE International Conference on Computer Vision. Beijing, China: IEEE, 2005. 1208-1213
    [4] Qiao L S, Chen S C, Tan X Y. Sparsity preserving projections with applications to face recognition. Pattern Recognition, 2010, 43(1):331-341 doi: 10.1016/j.patcog.2009.05.005
    [5] Schölkopf B, Smola A J, Müller K R. Kernel principal component analysis. In: Proceedings of the 7th International Conference on Artificial Neural Networks. Switzerland: Springer, 1997. 583-588
    [6] Roweis S T, Saul L K. Nonlinear dimensionality reduction by locally linear embedding. Science, 2000, 290(5500):2323-2326 doi: 10.1126/science.290.5500.2323
    [7] Huang G B, Ding X J, Zhou H M. Optimization method based extreme learning machine for classification. Neurocomputing, 2010, 74(1-3):155-163 doi: 10.1016/j.neucom.2010.02.019
    [8] Peng Y, Wang S H, Long X Z, Lu B L. Discriminative graph regularized extreme learning machine and its application to face recognition. Neurocomputing, 2015, 149:340-353 doi: 10.1016/j.neucom.2013.12.065
    [9] Peng Y, Lu B L. Discriminative manifold extreme learning machine and applications to image and EEG signal classification. Neurocomputing, 2016, 174:265-277 doi: 10.1016/j.neucom.2015.03.118
    [10] Zhang K, Luo M X. Outlier-robust extreme learning machine for regression problems. Neurocomputing, 2015, 151:1519-1527 doi: 10.1016/j.neucom.2014.09.022
    [11] Huang G, Song S J, Gupta J N D, Wu C. Semi-supervised and unsupervised extreme learning machines. IEEE Transactions on Cybernetics, 2014, 44(12):2405-2417 doi: 10.1109/TCYB.2014.2307349
    [12] Liu Zhan-Jie, Chen Xiao-Yun. Local subspace clustering. Acta Automatica Sinica, 2016, 42(8):1238-1247 http://www.aas.net.cn/CN/abstract/abstract18913.shtml
    [13] Wang Wei-Wei, Li Xiao-Ping, Feng Xiang-Chu, Wang Si-Qi. A survey on sparse subspace clustering. Acta Automatica Sinica, 2015, 41(8):1373-1384 http://www.aas.net.cn/CN/abstract/abstract18712.shtml
    [14] Kasun L L C, Yang Y, Huang G B, Zhang Z Y. Dimension reduction with extreme learning machine. IEEE Transactions on Image Processing, 2016, 25(8):3906-3918 doi: 10.1109/TIP.2016.2570569
    [15] Chen S S, Donoho D L, Saunders M A. Atomic decomposition by basis pursuit. SIAM Review, 2001, 43(1):129-159 doi: 10.1137/S003614450037906X
    [16] Yu L, Ding C, Loscalzo S. Stable feature selection via dense feature groups. In: Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Las Vegas, USA: ACM, 2008. 803-811
    [17] Gene expression model selector [Online], available: http://www.gems-system.org, October 9, 2017
Article History
  • Received:  2017-04-24
  • Accepted:  2017-10-03
  • Published:  2019-02-20
