[1] Vapnik V N. The Nature of Statistical Learning Theory. Berlin: Springer-Verlag, 1995
[2] Müller K R, Smola A, Rätsch G, Schölkopf B, Kohlmorgen J, Vapnik V. Predicting time series with support vector machines. In: Proceedings of the 7th International Conference on Artificial Neural Networks (ICANN'97). London, UK: Springer-Verlag, 1997. 999-1004
[3] Yang H Q, Huang K Z, King I, Lyu M R. Localized support vector regression for time series prediction. Neurocomputing, 2009, 72(10-12): 2659-2669
[4] Clarke S M, Griebsch J H, Simpson T W. Analysis of support vector regression for approximation of complex engineering analyses. Journal of Mechanical Design, 2005, 127(6): 1077-1088
[5] Thukaram D, Khincha H P, Vijaynarasimha H P. Artificial neural network and support vector machine approach for locating faults in radial distribution systems. IEEE Transactions on Power Delivery, 2005, 20(2): 710-721
[6] Cherkassky V, Shao X H, Mulier F M, Vapnik V N. Model complexity control for regression using VC generalization bounds. IEEE Transactions on Neural Networks, 1999, 10(5): 1075-1089
[7] Vapnik V, Chapelle O. Bounds on error expectation for support vector machines. Neural Computation, 2000, 12(9): 2013-2036
[8] Chapelle O, Vapnik V, Bengio Y. Model selection for small sample regression. Machine Learning, 2002, 48(1-3): 9-23
[9] Schölkopf B, Bartlett P L, Smola A J, Williamson R C. Shrinking the tube: a new support vector regression algorithm. In: Advances in Neural Information Processing Systems 11. Cambridge: MIT Press, 1998
[10] Schölkopf B, Smola A J, Williamson R C, Bartlett P L. New support vector algorithms. Neural Computation, 2000, 12(5): 1207-1245
[11] Kwok J T, Tsang I W. Linear dependency between ε and the input noise in ε-support vector regression. IEEE Transactions on Neural Networks, 2003, 14(3): 544-553
[12] Chang M W, Lin C J. Leave-one-out bounds for support vector regression model selection. Neural Computation, 2005, 17(5): 1188-1222
[13] Chapelle O, Vapnik V, Bousquet O, Mukherjee S. Choosing multiple parameters for support vector machines. Machine Learning, 2002, 46(1-3): 131-159
[14] Üstün B, Melssen W J, Oudenhuijzen M, Buydens L M C. Determination of optimal support vector regression parameters by genetic algorithms and simplex optimization. Analytica Chimica Acta, 2005, 544(1-2): 292-305
[15] Pai P F, Lin C S, Hong W C, Chen C T. A hybrid support vector machine regression for exchange rate prediction. Information and Management Sciences, 2006, 17(2): 19-32
[16] Tian W J, Tian Y. A new fuzzy identification approach using support vector regression and particle swarm optimization algorithm. In: Proceedings of the 2009 ISECS International Colloquium on Computing, Communication, Control, and Management. Sanya, China: IEEE, 2009. 86-90
[17] Cherkassky V, Ma Y Q. Practical selection of SVM parameters and noise estimation for SVM regression. Neural Networks, 2004, 17(1): 113-126
[18] Farooq T, Guergachi A, Krishnan S. Knowledge-based Green's kernel for support vector regression. Mathematical Problems in Engineering, 2010, 2010: 1-16
[19] Zhang W F, Dai D Q, Yan H. Framelet kernels with applications to support vector regression and regularization networks. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2010, 40(4): 1128-1144
[20] Apolloni B, Malchiodi D, Valerio L. Relevance regression learning with support vector machines. Nonlinear Analysis: Theory, Methods & Applications, 2010, 73(9): 2855-2867
[21] Hao P Y. New support vector algorithms with parametric insensitive/margin model. Neural Networks, 2010, 23(1): 60-73
[22] Cristianini N, Shawe-Taylor J. An Introduction to Support Vector Machines. Cambridge: Cambridge University Press, 2000
[23] Bunch J R, Kaufman L. A computational method for the indefinite quadratic programming problem. Linear Algebra and Its Applications, 1980, 34: 341-370
[24] Song X F, Chen W M, Chen Y J P P, Jiang B. Candidate working set strategy based SMO algorithm in support vector machine. Information Processing and Management, 2009, 45(5): 584-592
[25] Yi Hui, Song Xiao-Feng, Jiang Bin, Wang Ding-Cheng. Support vector machine based on nodes refined decision directed acyclic graph and its application to fault diagnosis. Acta Automatica Sinica, 2010, 36(3): 427-432 (易辉, 宋晓峰, 姜斌, 王定成. 基于结点优化的决策导向无环图支持向量机及其在故障诊断中的应用. 自动化学报, 2010, 36(3): 427-432)
[26] Song X F, Chen W M, Jiang B. Sample reducing method in SVM based on K-closed sub-clusters. International Journal of Innovative Computing, Information and Control, 2008, 4(7): 1751-1760
[27] Ge Z Q, Gao F R, Song Z H. Batch process monitoring based on support vector data description method. Journal of Process Control, 2011, 21(6): 949-959
[28] Zhou Dong-Hua, Hu Yan-Yan. Fault diagnosis techniques for dynamic systems. Acta Automatica Sinica, 2009, 35(6): 748-754 (周东华, 胡艳艳. 动态系统的故障诊断技术. 自动化学报, 2009, 35(6): 748-754)
[29] Zhou Dong-Hua, Li Gang, Li Yuan. Data-Driven Fault Diagnostic Techniques for Industrial Processes: Based on PCA and PLS. Beijing: Science Press, 2011 (周东华, 李钢, 李元. 数据驱动的工业过程故障诊断技术 —— 基于主元分析与偏最小二乘的方法. 北京: 科学出版社, 2011)
[30] Jiang B, Staroswiecki M, Cocquempot V. Fault accommodation for nonlinear dynamic systems. IEEE Transactions on Automatic Control, 2006, 51(9): 1578-1583
[31] Maki Y, Loparo K A. A neural-network approach to fault detection and diagnosis in industrial processes. IEEE Transactions on Control Systems Technology, 1997, 5(6): 529-541
[32] Zhang Y W, Teng Y D, Zhang Y. Complex process quality prediction using modified kernel partial least squares. Chemical Engineering Science, 2010, 65(6): 2153-2158
[33] Zhang Y W, Chai T Y, Li Z M, Yang C Y. Modeling and monitoring of dynamic processes. IEEE Transactions on Neural Networks and Learning Systems, 2012, 23(2): 277-284
[34] Zhang Y W, Zhou H, Qin S J, Chai T Y. Decentralized fault diagnosis of large-scale processes using multiblock kernel partial least squares. IEEE Transactions on Industrial Informatics, 2010, 6(1): 3-10