Regularized Monotonic Regression
Burdakov, Oleg (Linköping University, Department of Mathematics, Optimization; Linköping University, Faculty of Science & Engineering; ORCID iD: 0000-0003-1836-4200)
Sysoev, Oleg (Linköping University, Department of Computer and Information Science, Statistics; Linköping University, Faculty of Arts and Sciences)
2016 (English). Report (Other academic)
Abstract [en]

Monotonic (isotonic) regression (MR) is a powerful tool used for solving a wide range of important applied problems. One of its features, which poses a limitation on its use in some areas, is that it produces a piecewise constant fitted response. For smoothing the fitted response, we introduce a regularization term into the MR, which is formulated as a least distance problem with monotonicity constraints. The resulting Smoothed Monotonic Regression (SMR) is a convex quadratic optimization problem. We focus on the SMR where the set of observations is completely (linearly) ordered. Our Smoothed Pool-Adjacent-Violators (SPAV) algorithm is designed for solving the SMR. It belongs to the class of dual active-set algorithms. We prove its finite convergence to the optimal solution in, at most, n iterations, where n is the problem size. One of its advantages is that the active set is progressively enlarged by including one or, typically, more constraints per iteration. This allowed us to solve large-scale SMR test problems in a few iterations, whereas their size was prohibitively large for conventional quadratic optimization solvers. Although the worst-case complexity of the SPAV algorithm is O(n^2), its running time grew in our computational experiments in proportion to n^1.16.
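
For illustration only, the following is a minimal sketch of the completely ordered SMR problem described above, assuming the regularization is a quadratic penalty mu * sum_i (x_{i+1} - x_i)^2 on successive differences of the fitted values (as the keyword "quadratic penalty" suggests). It hands the resulting convex quadratic program to a generic solver (SciPy's SLSQP) rather than implementing the report's SPAV algorithm; the function and parameter names are illustrative, not taken from the report.

import numpy as np
from scipy.optimize import minimize

def smoothed_monotonic_fit(y, mu=1.0):
    """Sketch of Smoothed Monotonic Regression on completely ordered data.

    Minimizes sum_i (x_i - y_i)^2 + mu * sum_i (x_{i+1} - x_i)^2
    subject to x_1 <= x_2 <= ... <= x_n, using a generic QP-capable
    solver (not the SPAV algorithm of the report).
    """
    y = np.asarray(y, dtype=float)
    n = y.size

    def objective(x):
        # Least-distance term plus quadratic smoothing penalty.
        return np.sum((x - y) ** 2) + mu * np.sum(np.diff(x) ** 2)

    # Monotonicity constraints: every successive difference is nonnegative.
    constraints = [{"type": "ineq", "fun": lambda x, i=i: x[i + 1] - x[i]}
                   for i in range(n - 1)]

    # A sorted copy of y is a feasible (monotonic) starting point.
    res = minimize(objective, x0=np.sort(y), method="SLSQP",
                   constraints=constraints)
    return res.x

# Example: recover a smooth nondecreasing trend from noisy data.
rng = np.random.default_rng(0)
y = np.linspace(0.0, 1.0, 50) + 0.2 * rng.standard_normal(50)
fit = smoothed_monotonic_fit(y, mu=5.0)

For mu = 0 this reduces to ordinary monotonic regression, whose fit is piecewise constant; increasing mu trades fidelity to the observations for a smoother fitted response, which is the effect the regularization term is designed to give.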

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2016. 20 p.
Series
LiTH-MAT-R, ISSN 0348-2960 ; 2016:02
Keyword [en]
Monotonic regression, regularization, quadratic penalty, convex quadratic optimization, dual active-set method, large-scale optimization
National Category
Computational Mathematics; Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:liu:diva-128117
ISRN: LiTH-MAT-R--2016/02--SE
Libris ID: 19712369
OAI: oai:DiVA.org:liu-128117
DiVA: diva2:929073
Available from: 2016-05-17. Created: 2016-05-17. Last updated: 2016-09-28. Bibliographically approved.

Open Access in DiVA

Regularized Monotonic Regression (FULLTEXT01.pdf, 290 kB)
Type: fulltext; Mimetype: application/pdf
Checksum (SHA-512): 3f6c39cd5a417227e2218faa7fa9d244929e1899f6a47c15c0396e2253517a3715b1a38018abe8e9b9ab925e6402f7841dd3c10106abc56cfef8ffe40d421f20
