Regularized monotonic regression
2016 (English) Report (Other academic)
Monotonic (isotonic) Regression (MR) is a powerful tool used for solving a wide range of important applied problems. One of its features, which poses a limitation on its use in some areas, is that it produces a piecewise constant fitted response. For smoothing the fitted response, we introduce a regularization term in the MR formulated as a least distance problem with monotonicity constraints. The resulting Smoothed Monotonic Regression (SMR) is a convex quadratic optimization problem. We focus on the SMR, where the set of observations is completely (linearly) ordered. Our Smoothed Pool-Adjacent-Violators (SPAV) algorithm is designed for solving the SMR. It belongs to the class of dual active-set algorithms. We prove its finite convergence to the optimal solution in, at most, n iterations, where n is the problem size. One of its advantages is that the active set is progressively enlarged by including one or, typically, more constraints per iteration. This resulted in solving large-scale SMR test problems in a few iterations, whereas the size of those problems was prohibitively large for conventional quadratic optimization solvers. Although the complexity of the SPAV algorithm is O(n^2), its running time in our computational experiments grew in proportion to n^1.16.
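To make the piecewise-constant limitation concrete: in the notation suggested by the abstract, the SMR might take a form such as minimizing sum_i (x_i - a_i)^2 + mu * sum_i (x_{i+1} - x_i)^2 subject to x_1 <= ... <= x_n, where mu is the regularization parameter (the exact weighting is an assumption here, not taken from the report). The sketch below implements only the classic unregularized Pool-Adjacent-Violators (PAV) algorithm, whose output is the piecewise-constant fit that the regularization term is meant to smooth; it is not the SPAV algorithm itself, which additionally accounts for the quadratic smoothing term.

```python
# Classic Pool-Adjacent-Violators (PAV) for monotonic (isotonic) regression.
# Illustrates the piecewise-constant fitted response that motivates the
# regularized SMR formulation. NOT the report's SPAV algorithm.

def pav(y):
    """Return the nondecreasing fit minimizing sum (x_i - y_i)^2."""
    # Each block stores [sum_of_values, number_of_points]; its mean is
    # the fitted value for every point pooled into the block.
    blocks = []
    for v in y:
        blocks.append([v, 1])
        # Merge adjacent blocks while their means violate monotonicity.
        while len(blocks) > 1 and (
            blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]
        ):
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    # Expand block means back to a full-length fitted response.
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit

print(pav([1.0, 3.0, 2.0, 4.0, 3.5]))  # → [1.0, 2.5, 2.5, 3.75, 3.75]
```

The constant stretches (2.5, 2.5) and (3.75, 3.75) in the output are exactly the "steps" that a smoothing penalty on consecutive differences would round off.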
Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2016, 20 p.
Series: LiTH-MAT-R, ISSN 0348-2960 ; 2016:02
Keywords: Monotonic regression, regularization, quadratic penalty, convex quadratic optimization, dual active-set method, large-scale optimization
National Category: Computational Mathematics; Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:liu:diva-128117
ISRN: LiTH-MAT-R--2016/02--SE
OAI: oai:DiVA.org:liu-128117
DiVA: diva2:929073