Flexible Support Vector Regression and Its Application to Fault Detection
Abstract: Hyper-parameters, which determine the learning and generalization ability of support vector regression (SVR), are usually fixed during training. When SVR is applied to modeling complex systems, this fixed-parameter strategy leaves SVR in a dilemma between rigorous and slack parameter settings, because no single parameter set may suit the complicated distribution of the sample dataset; the choice becomes a trade-off between over-fitting and under-fitting. Therefore, this paper proposes a flexible support vector regression (F-SVR) algorithm in which the parameters adapt to the sample distribution during training. F-SVR divides the training dataset into several domains according to the complexity of the sample distribution and generates a different parameter set for each domain, effectively avoiding both over-fitting and under-fitting. The efficacy of the proposed method is first validated on an artificial dataset, where F-SVR yields better generalization ability than conventional SVR methods while maintaining good learning ability. Finally, F-SVR is successfully applied to practical fault detection of a high-frequency power supply.
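The domain-splitting idea in the abstract can be illustrated with a short sketch. Everything below is hypothetical: the paper's actual complexity measure and parameter values are not given in this excerpt, so a moving standard deviation of the targets stands in as the complexity proxy, and the RIGOROUS/SLACK parameter sets are illustrative placeholders for the per-domain training parameters.

```python
import numpy as np

# Hypothetical parameter sets: a "rigorous" set (small epsilon, large C) for
# complex domains and a "slack" set for smooth domains. The values are
# illustrative only; the paper does not specify them in this excerpt.
RIGOROUS = {"C": 100.0, "epsilon": 0.01}
SLACK = {"C": 1.0, "epsilon": 0.1}

def assign_domain_params(y, window=10):
    """Label each training sample as lying in a 'complex' or a 'smooth'
    domain, using the moving standard deviation of the targets y as a
    stand-in complexity measure, then attach a parameter set per sample."""
    y = np.asarray(y, dtype=float)
    pad = window // 2
    ypad = np.pad(y, pad, mode="edge")          # edge-pad so every sample gets a full window
    local_std = np.array([ypad[i:i + window].std()
                          for i in range(len(y))])
    complex_mask = local_std > local_std.mean()  # simple global threshold
    return [RIGOROUS if c else SLACK for c in complex_mask]

# Artificial data in the spirit of the paper's validation: a smooth ramp
# followed by a high-frequency oscillating segment.
x = np.linspace(0, 4 * np.pi, 200)
y = np.where(x < 2 * np.pi, 0.1 * x, np.sin(5 * x))
params = assign_domain_params(y)
```

A separate SVR would then be trained on each domain with its assigned parameter set, giving the slack parameters in smooth regions (better generalization) and the rigorous parameters in complex regions (better fitting).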
Key words:
- Support vector regression (SVR)
- flexible
- fault detection
- power supply