Stochastic Dual Coordinate Ascent for Learning Sign Constrained Linear Predictors

Authors

Miya Nakajima, Rikuto Mochida, Yuya Takada and Tsuyoshi Kato, Gunma University, Japan

Abstract

Sign constraints are a convenient way to incorporate domain-specific prior knowledge into machine learning. Under sign constraints, the signs of the weight coefficients of a linear predictor cannot be flipped from those specified in advance according to the prior knowledge. This paper presents new stochastic dual coordinate ascent (SDCA) algorithms that minimize the empirical risk under sign constraints. Generic surrogate loss functions can be plugged into the proposed algorithm, which inherits the strong convergence guarantee of the vanilla SDCA. A technical contribution of this work is an efficient algorithm that performs the sign-constrained SDCA update at a cost linear in the number of input features, matching the cost of the SDCA update without sign constraints. As a result, an ϵ-accurate solution is attained at a computational cost of O(nd). Pattern recognition experiments were carried out on a classification task for microbiological water quality analysis, and the results demonstrate the strong prediction performance achieved with sign constraints.
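The central idea of the abstract, constraining the signs of a linear predictor's weights according to prior knowledge, can be illustrated with a minimal sketch. The snippet below is not the paper's SDCA algorithm; it uses plain projected gradient descent on an ℓ2-regularized squared loss, with a hypothetical sign mask (+1 forces a nonnegative weight, −1 a nonpositive weight, 0 leaves the coordinate free), merely to show that the constrained minimizer keeps the prescribed signs. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def project_signs(w, sign_mask):
    """Clip each coordinate of w onto the cone given by sign_mask:
    +1 -> w_j >= 0, -1 -> w_j <= 0, 0 -> unconstrained."""
    w = w.copy()
    w[(sign_mask > 0) & (w < 0)] = 0.0
    w[(sign_mask < 0) & (w > 0)] = 0.0
    return w

def sign_constrained_ridge(X, y, sign_mask, lam=0.01, lr=0.05, iters=1000):
    """Projected gradient descent for
    min_w 1/(2n) ||Xw - y||^2 + lam/2 ||w||^2  s.t. sign constraints.
    (A simple stand-in for the paper's SDCA solver.)"""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / n + lam * w
        w = project_signs(w - lr * grad, sign_mask)
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(200)

# Hypothetical prior knowledge: feature 0 has a nonnegative effect,
# feature 1 a nonpositive effect, feature 2 is unconstrained.
mask = np.array([1, -1, 0])
w_hat = sign_constrained_ridge(X, y, mask)
print(w_hat)  # the signs of w_hat[0] and w_hat[1] respect the mask
```

The projection step is what the sign constraints amount to geometrically: the feasible set is a closed convex cone, so constrained empirical risk minimization remains a convex problem, which is what lets the paper's SDCA retain its convergence guarantee.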

Keywords

sign constraints, convex optimization, stochastic dual coordinate ascent, empirical risk minimization, microbiological water quality analysis.

Full Text  Volume 13, Number 18