An Improved Domain Multiple Kernel Support Vector Machine
Abstract: In the support vector machine (SVM), the choice of kernel function is critical: different kernels yield different classification results. How to exploit the complementary strengths of several kernels to jointly improve SVM performance has become an active research topic, giving rise to multiple kernel learning (MKL). Recently, a simple and effective sparse MKL algorithm, generalized multiple kernel learning (GMKL), was proposed; it combines the advantages of the L1-norm and the L2-norm to form an elastic constraint on the kernel weights. However, GMKL does not fully exploit the information shared among the selected kernels. Conversely, the MultiK-MHKS algorithm applies canonical correlation analysis (CCA) to extract the common information among the kernels, but ignores kernel selection. This paper improves on both algorithms; the resulting model is called the improved domain multiple kernel support vector machine (IDMK-SVM). We prove that the model retains the key properties of GMKL and that the algorithm converges. Finally, simulation experiments show that IDMK-SVM achieves higher classification accuracy than typical existing MKL algorithms.
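The two ingredients the abstract describes can be illustrated concretely. A minimal sketch (not the paper's actual algorithm; kernel parameters, penalty weights, and function names below are illustrative assumptions) of the elastic-net idea behind GMKL: combine base Gram matrices K_m with nonnegative weights d_m, and penalize d with a mix of the L1-norm (encouraging sparsity, i.e. kernel selection) and the squared L2-norm (encouraging smooth, non-degenerate weights):

```python
# Sketch of the elastic-net kernel-weight constraint discussed in the abstract.
# K(d) = sum_m d_m * K_m with d_m >= 0, penalized by
#   lam1 * ||d||_1 + lam2 * ||d||_2^2   (L1 term selects kernels, L2 term smooths).
# All parameter values here are illustrative, not taken from the paper.
import numpy as np

def rbf_kernel(X, gamma):
    # Gram matrix of an RBF kernel: exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def combined_kernel(X, weights, gammas):
    # Weighted sum of base kernels: K(d) = sum_m d_m K_m.
    return sum(d * rbf_kernel(X, g) for d, g in zip(weights, gammas))

def elastic_penalty(weights, lam1=1.0, lam2=1.0):
    # Elastic-net regularizer on the kernel weights.
    d = np.asarray(weights, dtype=float)
    return lam1 * np.sum(np.abs(d)) + lam2 * np.sum(d**2)

X = np.random.RandomState(0).randn(5, 3)
d = np.array([0.5, 0.0, 0.5])  # sparse weights: the middle kernel is dropped
K = combined_kernel(X, d, gammas=[0.1, 1.0, 10.0])
print(K.shape)             # (5, 5)
print(elastic_penalty(d))  # 1.0 * 1.0 + 1.0 * 0.5 = 1.5
```

With lam1 = lam2 = 0 the combination reduces to plain L2-style MKL; driving lam1 up pushes more weights d_m exactly to zero, which is the kernel-selection behavior that plain CCA-based methods such as MultiK-MHKS lack.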