A Parallel Learning Algorithm for Kernel Logistic Regression by Using Fenchel Duality
Keywords:
- Kernel logistic regression
- Fenchel duality
- Large-scale machine learning
- Convex optimization
Abstract: A parallel learning algorithm for solving large-scale kernel logistic regression problems is presented. Using the Fenchel duality theorem from convex optimization, the primal optimization problem of kernel logistic regression is converted into an optimization problem in the dual space. With a block-update iterative method, classifiers can then be trained independently on subsets of the training data. A simple client-server parallel computing mode is designed in which each client node optimizes a sub-problem on its own subset of the training data; after each optimization iteration ends, the server node uses the messages passed by all client nodes to update the objective functions of the sub-problems. Compared with non-parallel learning algorithms on standard datasets, encouraging results are obtained, demonstrating the feasibility of the Fenchel-duality-based parallel learning algorithm for kernel logistic regression.