A Dual Active-Set Algorithm for Regularized Monotonic Regression
Linköping University, Department of Mathematics, Optimization; Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0003-1836-4200
Linköping University, Department of Computer and Information Science, Statistics; Linköping University, Faculty of Arts and Sciences.
2017 (English). In: Journal of Optimization Theory and Applications, ISSN 0022-3239, E-ISSN 1573-2878, Vol. 172, no. 3, p. 929-949. Article in journal (Refereed). Published.
Abstract [en]

Monotonic (isotonic) regression is a powerful tool for solving a wide range of important applied problems. One of its features, which limits its use in some areas, is that it produces a piecewise constant fitted response. To smooth the fitted response, we introduce a regularization term into the monotonic regression, formulated as a least distance problem with monotonicity constraints. The resulting smoothed monotonic regression is a convex quadratic optimization problem. We focus on the case where the set of observations is completely (linearly) ordered. Our smoothed pool-adjacent-violators algorithm is designed for solving the regularized problem. It belongs to the class of dual active-set algorithms. We prove that it converges to the optimal solution in a finite number of iterations that does not exceed the problem size. One of its advantages is that the active set is progressively enlarged by including one or, typically, more constraints per iteration. This allowed large-scale test problems to be solved in a few iterations, whereas their size was prohibitively large for conventional quadratic optimization solvers. Although the complexity of our algorithm grows quadratically with the problem size, its running time grew almost linearly in our computational experiments.
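The limitation noted in the abstract, that standard monotonic regression yields a piecewise constant fit, is easy to see in the classic pool-adjacent-violators (PAV) algorithm, which the paper's smoothed variant builds on. The sketch below is a minimal plain-Python PAV for a completely (linearly) ordered sequence; it is illustrative only and is not the paper's SPAV method.

```python
def pav(y, w=None):
    """Pool-adjacent-violators for isotonic (nondecreasing) regression.

    Minimizes sum_i w[i] * (x[i] - y[i])**2 subject to x[0] <= ... <= x[n-1].
    Returns fitted values, which are piecewise constant over pooled blocks.
    """
    n = len(y)
    if w is None:
        w = [1.0] * n
    vals, wts, cnts = [], [], []  # one entry per pooled block
    for i in range(n):
        vals.append(float(y[i]))
        wts.append(float(w[i]))
        cnts.append(1)
        # Merge adjacent blocks while the monotonicity constraint is violated.
        while len(vals) > 1 and vals[-2] > vals[-1]:
            v2, w2, c2 = vals.pop(), wts.pop(), cnts.pop()
            v1, w1, c1 = vals.pop(), wts.pop(), cnts.pop()
            wt = w1 + w2
            vals.append((w1 * v1 + w2 * v2) / wt)  # weighted mean of the pool
            wts.append(wt)
            cnts.append(c1 + c2)
    fit = []
    for v, c in zip(vals, cnts):
        fit.extend([v] * c)  # constant value within each block
    return fit
```

Each pooled block receives the weighted mean of its observations, which is exactly why the fit is piecewise constant. The paper's regularization term (a quadratic penalty, per the keywords) penalizes jumps between neighbouring fitted values and thereby smooths these flat blocks.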

Place, publisher, year, edition, pages
Springer, 2017. Vol. 172, no. 3, p. 929-949.
Keyword [en]
Large-scale optimization, Monotonic regression, Regularization, Quadratic penalty, Convex quadratic optimization, Dual active-set method
National Category
Computational Mathematics; Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:liu:diva-134141
DOI: 10.1007/s10957-017-1060-0
ISI: 000395084600010
OAI: oai:DiVA.org:liu-134141
DiVA id: diva2:1068281
Available from: 2017-01-24. Created: 2017-01-24. Last updated: 2017-04-20. Bibliographically approved.

Open Access in DiVA

fulltext (693 kB), 23 downloads
File information
File name: FULLTEXT02.pdf
File size: 693 kB
Checksum (SHA-512):
a40835f5519d9e1f19c98435584df5327143a16e7165a043fb40e455962fc3c4512a010554093c8ee526f42257428af1642f33ce11e5765fa73a85110b946d96
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Search in DiVA

By author/editor: Burdakov, Oleg; Sysoev, Oleg
By organisation: Optimization; Faculty of Science & Engineering; Statistics; Faculty of Arts and Sciences
In the same journal: Journal of Optimization Theory and Applications
National categories: Computational Mathematics; Probability Theory and Statistics

Total: 23 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.
