Impact Factor (CJCR, 2023): 2.845

  • Chinese Core Journals
  • EI
  • China Science and Technology Core Journals
  • Scopus
  • CSCD
  • INSPEC (Science Abstracts, UK)


Self-attention Adversarial Based Deep Subspace Clustering

Yin Ming, Wu Hao-Yang, Xie Sheng-Li, Yang Qi-Yu

Citation: Yin Ming, Wu Hao-Yang, Xie Sheng-Li, Yang Qi-Yu. Self-attention adversarial based deep subspace clustering. Acta Automatica Sinica, 2022, 48(1): 271−281. doi: 10.16383/j.aas.c200302


doi: 10.16383/j.aas.c200302
Funds: Supported by the National Natural Science Foundation of China (U1911401, 61973087, 61876042), the Guangdong Basic and Applied Basic Research Foundation (2020A1515011493), and the Open Research Fund of the State Key Laboratory of Synthetical Automation for Process Industries (2020-KF-21-02)

    Author Bio:

    YIN Ming  Professor at the School of Automation, Guangdong University of Technology. His research interest covers image processing, pattern recognition, computer vision, and machine learning. E-mail: yiming@gdut.edu.cn

    WU Hao-Yang  Master student at the School of Automation, Guangdong University of Technology. His research interest covers subspace learning and deep clustering. E-mail: tarkovskyfans@163.com

    XIE Sheng-Li  Professor at the School of Automation, Guangdong University of Technology, and IEEE Fellow. His research interest covers blind signal processing and biomedical signal processing. E-mail: shlxie@gdut.edu.cn

    YANG Qi-Yu  Lecturer at the School of Automation, Guangdong University of Technology. His research interest covers signal processing and real-time data processing. Corresponding author of this paper. E-mail: yangqiyu@gdut.edu.cn

  • Abstract: Subspace clustering is a popular framework, built on spectral clustering, for clustering high-dimensional data. In recent years it has drawn wide attention because deep neural networks can effectively extract the underlying features of data. Deep subspace clustering learns a low-dimensional feature representation of the raw data through a deep network, computes a similarity matrix over the dataset, and then applies spectral clustering to obtain the final clustering result. However, real-world data are often high-dimensional and structurally complex, so learning a more robust representation that improves clustering performance remains a challenge. This paper therefore proposes a self-attention adversarial based deep subspace clustering algorithm (SAADSC). A self-attention adversarial network imposes a prior distribution constraint on the feature learning of the autoencoder, guiding the learned representation to be more robust and thereby improving clustering accuracy. Experiments on multiple datasets show that the proposed algorithm outperforms state-of-the-art methods in terms of accuracy (ACC) and normalized mutual information (NMI).
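The pipeline the abstract describes — learn latent features, recover self-expressive coefficients, and feed the resulting affinity matrix to spectral clustering — can be illustrated at its core by the self-expression step. The following is a minimal NumPy sketch under a ridge (Frobenius-norm) regularizer, which admits a closed-form solution; the function name and the regularizer choice are our assumptions for illustration, not the paper's implementation, which learns the coefficients end-to-end inside a deep autoencoder.

```python
import numpy as np

def self_expressive_affinity(Z, lam=0.1):
    """Given latent features Z (n_samples x d), solve the ridge-regularized
    self-expression problem min_C ||Z - C Z||^2 + lam ||C||^2 in closed form,
    then symmetrize |C| into an affinity matrix for spectral clustering."""
    n = Z.shape[0]
    G = Z @ Z.T                                   # Gram matrix of latent codes
    C = np.linalg.solve(G + lam * np.eye(n), G)   # row i: coefficients reconstructing z_i
    np.fill_diagonal(C, 0.0)                      # forbid trivial self-reconstruction
    A = 0.5 * (np.abs(C) + np.abs(C).T)           # symmetric affinity matrix
    return A
```

The affinity `A` would then be handed to a standard spectral clustering routine to produce the final labels.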
  • Fig. 1  The framework of deep subspace clustering

    Fig. 2  The framework of generative adversarial networks

    Fig. 3  Self-attention module

    Fig. 4  The framework of self-attention adversarial network based deep subspace clustering

    Fig. 5  The loss function of SAADSC during training on MNIST
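The self-attention module of Fig. 3 can be sketched in the style of self-attention GANs (reference [43]): query/key/value projections, a softmax attention map over spatial positions, and a scaled residual. This is a hedged NumPy illustration in which plain projection matrices stand in for the 1×1 convolutions; the function signature and `gamma` default are assumptions, not the authors' code.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv, gamma=0.1):
    """SAGAN-style self-attention over a flattened feature map.
    x: (C, N) features over N spatial positions;
    Wq, Wk: (C', C) query/key projections; Wv: (C, C) value projection;
    gamma scales the attention residual."""
    q = Wq @ x                                     # (C', N) queries
    k = Wk @ x                                     # (C', N) keys
    v = Wv @ x                                     # (C, N) values
    logits = q.T @ k                               # (N, N) position-pair similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    attn = np.exp(logits)
    attn /= attn.sum(axis=1, keepdims=True)        # softmax: each row sums to 1
    return x + gamma * (v @ attn.T)                # attention-weighted residual
```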

    Table 1  Information of the datasets

    Dataset | Classes | Samples | Image size
    MNIST | 10 | 1000 | 28×28
    FMNIST | 10 | 1000 | 28×28
    COIL-20 | 20 | 1440 | 32×32
    YaleB | 38 | 2432 | 48×32
    USPS | 10 | 9298 | 16×16

    Table 2  Parameter settings

    Dataset | $\lambda_1$ | $\lambda_2$ | $\lambda_3$
    MNIST | 1 | 0.5 | 10
    FMNIST | 1 | 0.0001 | 100
    COIL-20 | 1 | 30 | 10
    YaleB | 1 | 0.06 | 24
    USPS | 1 | 0.1 | 10

    Table 3  Network structure parameters

    Dataset | Kernel sizes | Channels
    MNIST | [5, 3, 3] | [10, 20, 30]
    FMNIST | [5, 3, 3, 3] | [10, 20, 30, 40]
    COIL-20 | [3] | [15]
    YaleB | [5, 3, 3] | [64, 128, 256]
    USPS | [5, 3, 3] | [10, 20, 30]

    Table 4  Experimental results on the five datasets

    Method | YaleB ACC | YaleB NMI | COIL-20 ACC | COIL-20 NMI | MNIST ACC | MNIST NMI | FMNIST ACC | FMNIST NMI | USPS ACC | USPS NMI
    DSC-L1 | 0.9667 | 0.9687 | 0.9314 | 0.9395 | 0.7280 | 0.7217 | 0.5769 | 0.6151 | 0.6984 | 0.6765
    DSC-L2 | 0.9733 | 0.9703 | 0.9368 | 0.9408 | 0.7500 | 0.7319 | 0.5814 | 0.6133 | 0.7288 | 0.6963
    DEC | — | — | 0.6284 | 0.7789 | 0.8430 | 0.8000 | 0.5900 | 0.6010 | 0.7529 | 0.7408
    DCN | 0.4300 | 0.6300 | 0.1889 | 0.3039 | 0.7500 | 0.7487 | 0.5867 | 0.5940 | 0.7380 | 0.7691
    StructAE | 0.9720 | 0.9734 | 0.9327 | 0.9566 | 0.6570 | 0.6898 | — | — | — | —
    DASC | 0.9856 | 0.9801 | 0.9639 | 0.9686 | 0.8040 | 0.7800 | — | — | — | —
    SAADSC | 0.9897 | 0.9856 | 0.9750 | 0.9745 | 0.9540 | 0.9281 | 0.6318 | 0.6246 | 0.7850 | 0.8134
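For reference, the two metrics reported in Tables 4–8 can be computed as below. This is a generic NumPy sketch, not the paper's evaluation script: ACC uses brute-force label matching (practical only for small numbers of clusters) and NMI uses geometric-mean normalization; both are standard definitions, but the function names are ours.

```python
import numpy as np
from itertools import permutations

def clustering_acc(y_true, y_pred):
    """Unsupervised accuracy: best accuracy over all label permutations."""
    labels = np.unique(np.concatenate([y_true, y_pred]))
    best = 0.0
    for perm in permutations(labels):
        mapping = dict(zip(labels, perm))
        remapped = np.array([mapping[p] for p in y_pred])
        best = max(best, float(np.mean(remapped == y_true)))
    return best

def nmi(y_true, y_pred):
    """Normalized mutual information from the contingency table,
    normalized by the geometric mean of the two label entropies."""
    n = len(y_true)
    cats_t, inv_t = np.unique(y_true, return_inverse=True)
    cats_p, inv_p = np.unique(y_pred, return_inverse=True)
    cont = np.zeros((len(cats_t), len(cats_p)))
    for i, j in zip(inv_t, inv_p):
        cont[i, j] += 1
    p = cont / n                                   # joint distribution
    pt, pp = p.sum(1), p.sum(0)                    # marginals
    nz = p > 0
    mi = (p[nz] * np.log(p[nz] / (pt[:, None] * pp[None, :])[nz])).sum()
    ht = -(pt[pt > 0] * np.log(pt[pt > 0])).sum()  # entropy of true labels
    hp = -(pp[pp > 0] * np.log(pp[pp > 0])).sum()  # entropy of predicted labels
    return mi / np.sqrt(ht * hp)
```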

    Table 5  Clustering results with different prior distributions

    Prior | MNIST ACC | MNIST NMI | FMNIST ACC | FMNIST NMI | USPS ACC | USPS NMI
    Gaussian | 0.9540 | 0.9281 | 0.6318 | 0.6246 | 0.7850 | 0.8134
    Bernoulli | 0.9320 | 0.9043 | 0.6080 | 0.5990 | 0.7755 | 0.7917
    Deterministic | 0.8670 | 0.8362 | 0.5580 | 0.5790 | 0.7796 | 0.7914
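The three priors compared above can be sampled as follows; a minimal sketch in which the shapes, parameters (standard Gaussian, fair Bernoulli, zero point mass), and function name are chosen for illustration, since this page does not specify them.

```python
import numpy as np

def sample_prior(name, shape, rng=None):
    """Draw latent samples from one of the priors compared in Table 5."""
    rng = np.random.default_rng(rng)
    if name == "gaussian":
        return rng.normal(0.0, 1.0, size=shape)        # standard normal
    if name == "bernoulli":
        return rng.integers(0, 2, size=shape).astype(float)  # fair coin flips
    if name == "deterministic":
        return np.zeros(shape)                         # a fixed point mass
    raise ValueError(f"unknown prior: {name}")
```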

    Table 6  Ablation study of the modules in SAADSC

    Method | YaleB ACC | YaleB NMI | COIL-20 ACC | COIL-20 NMI | MNIST ACC | MNIST NMI | FMNIST ACC | FMNIST NMI | USPS ACC | USPS NMI
    Test1 | 0.9725 | 0.9672 | 0.9382 | 0.9493 | 0.8820 | 0.8604 | 0.6080 | 0.6110 | 0.7748 | 0.7838
    Test2 | 0.0711 | 0.0961 | 0.4229 | 0.6263 | 0.6420 | 0.5940 | 0.5380 | 0.4917 | 0.6105 | 0.5510
    Test3 | 0.0843 | 0.1222 | 0.6993 | 0.7855 | 0.6610 | 0.6763 | 0.6140 | 0.5922 | 0.3826 | 0.3851
    Test4 | 0.9782 | 0.9702 | 0.9683 | 0.9741 | 0.9500 | 0.9275 | 0.6211 | 0.6143 | 0.7850 | 0.7986
    DSC-L2 | 0.9733 | 0.9703 | 0.9368 | 0.9408 | 0.7500 | 0.7319 | 0.5814 | 0.6133 | 0.7288 | 0.6963
    SAADSC | 0.9897 | 0.9856 | 0.9750 | 0.9745 | 0.9540 | 0.9281 | 0.6318 | 0.6246 | 0.7850 | 0.8134

    Table 7  Clustering results on the noisy COIL-20

    Noise level | SAADSC ACC | SAADSC NMI | DSC-L1 ACC | DSC-L1 NMI | DSC-L2 ACC | DSC-L2 NMI | DASC ACC | DASC NMI
    No noise | 0.9750 | 0.9745 | 0.9314 | 0.9353 | 0.9368 | 0.9408 | 0.9639 | 0.9686
    10% noise | 0.9590 | 0.9706 | 0.8751 | 0.8976 | 0.8714 | 0.9107 | 0.9021 | 0.9392
    20% noise | 0.9111 | 0.9593 | 0.8179 | 0.8736 | 0.8286 | 0.8857 | 0.8607 | 0.9193
    30% noise | 0.8708 | 0.9638 | 0.7989 | 0.8571 | 0.8072 | 0.8784 | 0.8357 | 0.9143
    40% noise | 0.8569 | 0.9272 | 0.6786 | 0.7857 | 0.7250 | 0.8187 | 0.7805 | 0.8753

    Table 8  Clustering results on the noisy USPS

    Noise level | SAADSC ACC | SAADSC NMI | DSC-L1 ACC | DSC-L1 NMI | DSC-L2 ACC | DSC-L2 NMI
    No noise | 0.7850 | 0.8134 | 0.6984 | 0.6765 | 0.7288 | 0.6963
    10% noise | 0.7778 | 0.7971 | 0.6704 | 0.6428 | 0.6562 | 0.6628
    20% noise | 0.7757 | 0.7901 | 0.6667 | 0.6158 | 0.6530 | 0.6429
    30% noise | 0.7719 | 0.7844 | 0.6386 | 0.5987 | 0.6454 | 0.6394
    40% noise | 0.7674 | 0.7750 | 0.6042 | 0.5752 | 0.6351 | 0.6164
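This page does not specify the noise model behind Tables 7 and 8. As one plausible illustration of corrupting a stated fraction of pixels, the sketch below applies salt-and-pepper noise; the noise model, function name, and parameters are all assumptions rather than the paper's procedure.

```python
import numpy as np

def add_salt_pepper(images, ratio, rng=None):
    """Corrupt a given fraction of pixels per image with salt-and-pepper noise.
    images: (n, h, w) array scaled to [0, 1]; ratio: fraction of pixels hit."""
    rng = np.random.default_rng(rng)
    out = images.copy()
    n, h, w = out.shape
    k = int(round(ratio * h * w))                    # pixels to corrupt per image
    for img in out:                                  # each img is a view into out
        idx = rng.choice(h * w, size=k, replace=False)
        vals = rng.integers(0, 2, size=k).astype(float)  # 0 = pepper, 1 = salt
        img.flat[idx] = vals
    return out
```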
[1] Aggarwal C C. An Introduction to Data Classification. Data Classification: Algorithms and Applications, 2014, 125(3): 142−147
    [2] MacQueen J. Some methods for classification and analysis of multivariate observations. In: Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability. 1967, 1(14): 281−297
    [3] Johnson S C. Hierarchical clustering schemes. Psychometrika, 1967, 32(3): 241-254. doi: 10.1007/BF02289588
    [4] Ng A Y, Jordan M I, Weiss Y. "On spectral clustering: Analysis and an algorithm." Advances in Neural Information Processing Systems. 2002.
    [5] Basri R, Jacobs D W. Lambertian reflectance and linear subspaces. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2003, 25(2): 218-233. doi: 10.1109/TPAMI.2003.1177153
    [6] Agrawal R, Gehrke J, Gunopulos D, Raghavan P. Automatic subspace clustering of high dimensional data. Data Mining and Knowledge Discovery, 2005, 11(1): 5-33. doi: 10.1007/s10618-005-1396-1
    [7] Wang Wei-Wei, Li Xiao-Ping, Feng Xiang-Chu, Wang Si-Qi. A survey on sparse subspace clustering. Acta Automatica Sinica, 2015, 41(8): 1373-1384 (in Chinese)
    [8] Bradley P S, Mangasarian O L. K-plane clustering. Journal of Global Optimization, 2000, 16(1): 23-32. doi: 10.1023/A:1008324625522
    [9] Gear C W. Multibody grouping from motion images. International Journal of Computer Vision, 1998, 29(2): 133-150. doi: 10.1023/A:1008026310903
    [10] Yang A Y, Rao S R, Ma Y. Robust statistical estimation and segmentation of multiple subspaces. In: Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'06). IEEE, 2006.
    [11] Elhamifar E, Vidal R. Sparse subspace clustering. In: Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2009: 2790−2797
    [12] Liu G, Lin Z, Yu Y. Robust subspace segmentation by low-rank representation. In: Proceedings of the 27th International Conference on Machine Learning (ICML-10). 2010: 663−670
    [13] Luo D, Nie F, Ding C, Huang H. Multi-subspace representation and discovery. Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer, Berlin, Heidelberg, 2011: 405−420
    [14] Zhuang L, Gao H, Lin Z, Ma Y, Zhang X, Yu N. Non-negative low rank and sparse graph for semi-supervised learning. In: Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2012: 2328−2335
    [15] Kang Z, Zhao X, Peng C, Zhu H, Zhou J T, Peng X, et al. Partition level multiview subspace clustering. Neural Networks, 2020, 122: 279-288. doi: 10.1016/j.neunet.2019.10.010
    [16] Kang Z, Pan H, Hoi S C, Xu Z. Robust graph learning from noisy data. IEEE Transactions on Cybernetics, 2019: 1833-1843.
    [17] Zhou Lin, Ping Xi-Jian, Xu Sen, Zhang Tao. Cluster ensemble based on spectral clustering. Acta Automatica Sinica, 2012, 38(8): 1335-1342 (in Chinese). doi: 10.3724/SP.J.1004.2012.01335
    [18] Rumelhart D E, Hinton G E, Williams R J. Learning representations by back-propagating errors. Nature, 1986, 323(6088): 533-536. doi: 10.1038/323533a0
    [19] Hinton G E, Osindero S, Teh Y W. A fast learning algorithm for deep belief nets. Neural Computation, 2006, 18(7): 1527-1554. doi: 10.1162/neco.2006.18.7.1527
    [20] Vincent P, Larochelle H, Bengio Y, Manzagol P A. Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning. 2008: 1096−1103
    [21] Bengio Y, Lamblin P, Popovici D, Larochelle H. Greedy layer-wise training of deep networks. Advances in Neural Information Processing Systems. 2007: 153-160.
    [22] Masci J, Meier U, Ciresan D, Schmidhuber J. Stacked convolutional auto-encoders for hierarchical feature extraction. In: Proceedings of the International Conference on Artificial Neural Networks, pages 52−59. Springer, 2011
    [23] Yang B, Fu X, Sidiropoulos N D, Hong, M. Towards k-means-friendly spaces: Simultaneous deep learning and clustering. In: Proceedings of the 34th International Conference on Machine Learning-Volume 70. JMLR. org, 2017: 3861−3870
    [24] Shah S A, Koltun V. Deep continuous clustering [Online]. available: https://arxiv.org/abs/1803.01449, March 5, 2018.
    [25] Xie J, Girshick R, Farhadi A. Unsupervised deep embedding for clustering analysis. In: Proceedings of the 36th International Conference on Machine Learning (ICML). 2016: 478−487
    [26] Ren Y, Wang N, Li M. and Xu Z. Deep density-based image clustering. Knowledge-Based Systems. 2020: 105841.
    [27] Ren Y, Hu K, Dai X, Pan L, Hoi S C, Xu Z. Semi-supervised deep embedded clustering. Neurocomputing, 2019, 325: 121-130. doi: 10.1016/j.neucom.2018.10.016
    [28] Yang J, Parikh D, Batra D. Joint unsupervised learning of deep representations and image clusters. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2016: 5147−5156
    [29] Cho K, Van Merrienboer B, Bahdanau D, Bengio Y. On the properties of neural machine translation: Encoder–decoder approaches. In: Proceedings of SSST-8, Eighth Workshop on Syntax, Semantics and Structure in Statistical Translation, pages 103−111, Doha, Qatar, October 2014. ACL.
    [30] Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate. In: Proceedings of the International Conference on Learning Representations (ICLR), 2015.
    [31] Xiao T, Xu Y, Yang K, Zhang J, Peng Y, Zhang Z. The application of two-level attention models in deep convolutional neural network for fine-grained image classification. In: Proceedings of the IEEE conference on computer vision and pattern recognition(CVPR). 2015: 842−850
    [32] Xu K, Ba J, Kiros R, Cho K, Courville A, Salakhudinov R, et al. Show, attend and tell: Neural image caption generation with visual attention. In: Proceedings of the 35th International Conference on Machine Learning (ICML). 2015: 2048−2057
    [33] Cheng J, Dong L, Lapata M. Long short-term memory-networks for machine reading. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2016
    [34] Luong M T, Pham H, Manning C D. Effective approaches to attention-based neural machine translation. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2015.
    [35] Goodfellow I, Pouget-Abadie J, Mirza M, Xu B, Warde-Farley D, Ozair S, et al. Generative adversarial nets. In: Proceedings of the Neural Information Processing Systems (NIPS). 2014: 2672−2680
    [36] Chen X, Duan Y, Houthooft R, Schulman J, Sutskever I, Abbeel P. Infogan: Interpretable representation learning by information maximizing generative adversarial nets. In: Proceedings of the Neural Information Processing Systems (NIPS). 2016: 2172−2180
    [37] Mukherjee S, Asnani H, Lin E, Kannan S. Clustergan: Latent space clustering in generative adversarial networks. In: Proceedings of the AAAI Conference on Artificial Intelligence. 2019, 33: 4610−4617
    [38] Makhzani A, Shlens J, Jaitly N, Goodfellow I, Frey B. Adversarial autoencoders [Online]. available: https://arxiv.org/abs/1511.05644, May 25, 2015.
    [39] Ji P, Salzmann M, Li H. Efficient dense subspace clustering. IEEE Winter Conference on Applications of Computer Vision. IEEE, 2014: 461−468
    [40] Ji P, Zhang T, Li H, Salzmann M, Reid I. Deep subspace clustering networks. In: Proceedings of the Neural Information Processing Systems (NIPS). 2017: 24−33
    [41] Wang Kun-Feng, Gou Chao, Duan Yan-Jie, Lin Yi-Lun, Zheng Xin-Hu, Wang Fei-Yue. Generative adversarial networks: The state of the art and beyond. Acta Automatica Sinica, 2017, 43(3): 321−332 (in Chinese)
    [42] Kingma D P, Welling M. Auto-encoding variational Bayes. In Proceedings of the International Conference on Learning Representations (ICLR), 2014
    [43] Zhang H, Goodfellow I, Metaxas D, Odena A. Self-attention generative adversarial networks. In: Proceedings of the 36th International Conference on Machine Learning (ICML), Long Beach, California, USA: PMLR 97, 2019. 7354−7363
    [44] Arjovsky M, Chintala S, Bottou L. Wasserstein GAN. In: Proceedings of the International Conference on Machine Learning (ICML), 2017. 214−223
    [45] Gulrajani I, Ahmed F, Arjovsky M, Dumoulin V, Courville A C. Improved training of wasserstein GANs. In: Proceedings of the Neural Information Processing Systems (NIPS). 2017: 5767−5777
    [46] Wu J, Huang Z, Thoma J, Acharya D, Van Gool L. Wasserstein divergence for GANs. In: Proceedings of the European Conference on Computer Vision (ECCV). 2018: 653−668
    [47] LeCun Y, Bottou L, Bengio Y, Haffner P. Gradient-based learning applied to document recognition. In: Proceedings of the IEEE, 1998, 86(11): 2278−2324
    [48] Nene S A, Nayar S K, Murase H. Columbia object image library (coil-20). 1996.
    [49] Lee K C, Ho J, Kriegman D J. Acquiring linear subspaces for face recognition under variable lighting. IEEE Transactions on pattern analysis and machine intelligence, 2005, 27(5): 684-698. doi: 10.1109/TPAMI.2005.92
    [50] Kingma D P, Ba J. Adam: A method for stochastic optimization. In Proceedings of the International Conference on Learning Representations (ICLR), 2015.
    [51] Xu W, Liu X, Gong Y. Document clustering based on non-negative matrix factorization. In: Proceedings of the 26th Annual International ACM SIGIR Conference on Research and Development in Informaion Retrieval. 2003: 267−273
    [52] Peng X, Feng J, Xiao S, Yau W Y, Zhou J T, Yang S. Structured autoencoders for subspace clustering. IEEE Transactions on Image Processing, 2018, 27(10): 5076-5086. doi: 10.1109/TIP.2018.2848470
    [53] Zhou P, Hou Y, Feng J. Deep adversarial subspace clustering. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2018: 1596−1604.
Publication history
  • Received: 2020-05-12
  • Published online: 2022-01-25
  • Issue date: 2022-01-25
