[1] Turney P. Types of cost in inductive concept learning. In: Proceedings of the Cost-Sensitive Learning Workshop at the 17th ICML Conference. Stanford, USA: NRC, 2000. 15-21
[2] Ting K M. An instance-weighting method to induce cost-sensitive trees. IEEE Transactions on Knowledge and Data Engineering, 2002, 14(3): 659-665
[3] Domingos P. MetaCost: a general method for making classifiers cost-sensitive. In: Proceedings of the 5th International Conference on Knowledge Discovery and Data Mining. New York, USA: ACM, 1999. 155-164
[4] Elkan C. The foundations of cost-sensitive learning. In: Proceedings of the 17th International Joint Conference on Artificial Intelligence. San Francisco, USA: Morgan Kaufmann, 2001. 973-978
[5] Bruha I, Kockova S. A support for decision-making: cost-sensitive learning system. Artificial Intelligence in Medicine, 1999, 6(1): 67-82
[6] Zadrozny B, Langford J, Abe N. Cost-sensitive learning by cost-proportionate example weighting. In: Proceedings of the 3rd IEEE International Conference on Data Mining. Washington D.C., USA: IEEE, 2003. 435-442
[7] Ling C X, Sheng V S, Yang Q. Test strategies for cost-sensitive decision trees. IEEE Transactions on Knowledge and Data Engineering, 2006, 18(8): 1055-1067
[8] Chai X, Deng L, Yang Q, Ling C X. Test-cost sensitive Naive Bayes classification. In: Proceedings of the 4th IEEE International Conference on Data Mining. Washington D.C., USA: IEEE, 2004. 51-58
[9] Ling C X, Sheng V S. A comparative study of cost-sensitive classifiers. Chinese Journal of Computers, 2007, 30(8): 1203-1211
[10] Ye Zhi-Fei, Wen Yi-Min, Lv Bao-Liang. A survey of imbalanced pattern classification problem. CAAI Transactions on Intelligent Systems, 2009, 4(2): 138-156 (叶志飞, 文益民, 吕宝粮. 不平衡分类问题研究综述. 智能系统学报, 2009, 4(2): 138-156)
[11] Ting K M, Zheng Z. Boosting cost-sensitive trees. In: Proceedings of the 1st International Conference on Discovery Science. London, UK: Springer, 1998. 244-255
[12] Fan W, Stolfo S J, Zhang J, Chan P K. AdaCost: misclassification cost-sensitive boosting. In: Proceedings of the 16th International Conference on Machine Learning. Bled, Slovenia: Morgan Kaufmann, 1999. 97-105
[13] Schapire R E, Singer Y. Improved boosting algorithms using confidence-rated predictions. Machine Learning, 1999, 37(3): 297-336
[14] Yin Hui, Cao Yong-Feng, Sun Hong. Urban scene classification based on multi-dimensional pyramid representation and AdaBoost using high resolution SAR images. Acta Automatica Sinica, 2010, 36(8): 1099-1106 (殷慧, 曹永锋, 孙洪. 基于多维金字塔表达和AdaBoost的高分辨率SAR图像城区场景分类算法. 自动化学报, 2010, 36(8): 1099-1106)
[15] Ge Jun-Feng, Luo Yu-Pin. A comprehensive study for asymmetric AdaBoost and its application in object detection. Acta Automatica Sinica, 2009, 35(11): 1403-1409
[16] Fu Zhong-Liang. The effectiveness analysis of AdaBoost. Journal of Computer Research and Development, 2008, 45(10): 1747-1755 (付忠良. 关于AdaBoost有效性的分析. 计算机研究与发展, 2008, 45(10): 1747-1755)
[17] Fu Zhong-Liang. Effective property and best combination of classifiers linear combination. Journal of Computer Research and Development, 2009, 46(7): 1206-1216 (付忠良. 分类器线性组合的有效性和最佳组合问题的研究. 计算机研究与发展, 2009, 46(7): 1206-1216)
[18] Zhu J, Rosset S, Zou H, Hastie T. Multi-class AdaBoost. Statistics and Its Interface, 2009, 2(3): 349-360
[19] Boutell M R, Luo J, Shen X, Brown C M. Learning multi-label scene classification. Pattern Recognition, 2004, 37(9): 1757-1771
[20] Zhang M L, Zhou Z H. M3MIML: a maximum margin method for multi-instance multi-label learning. In: Proceedings of the 8th IEEE International Conference on Data Mining. Pisa, Italy: IEEE, 2008. 688-697
[21] Zhang M L, Zhou Z H. ML-KNN: a lazy learning approach to multi-label learning. Pattern Recognition, 2007, 40(7): 2038-2048
[22] Fu Zhong-Liang, Zhao Xiang-Hui. Dynamic combination method of classifiers and ensemble learning algorithms based on classifiers combination. Journal of Sichuan University (Engineering Science Edition), 2011, 43(2): 58-65 (付忠良, 赵向辉. 分类器动态组合及基于分类器组合的集成学习算法. 四川大学学报(工程科学版), 2011, 43(2): 58-65)