Computational study of the step size parameter of the subgradient optimization method
(English) Manuscript (preprint) (Other academic)
The subgradient optimization method is a simple and flexible iterative algorithm for convex optimization. It is much simpler than Newton's method, can be applied to a wider variety of problems, and converges even when the objective function is non-differentiable. Since an efficient algorithm should not only produce a good solution but also require less computing time, a simple algorithm that still delivers high-quality solutions is preferable. In this study, a series of step size parameters in the subgradient update is studied. Performance is compared on a general piecewise function and a specific p-median problem, and we examine how the quality of the solution changes under five forms of the step size parameter.
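The subgradient update x_{k+1} = x_k - t_k g_k, with g_k a subgradient of the objective at x_k, can be sketched as below. The piecewise-linear test function, the step-size constants, and the four rules shown are illustrative assumptions for this sketch; they are not the five forms studied in the manuscript:

```python
import numpy as np

# Illustrative non-differentiable convex objective (an assumption, not the
# manuscript's test problem): f(x) = max_i (a_i^T x + b_i).
A = np.array([[1.0, 2.0], [-3.0, 1.0], [2.0, -1.0]])
b = np.array([0.5, -1.0, 0.0])

def f(x):
    return np.max(A @ x + b)

def subgrad(x):
    # A valid subgradient of a pointwise max is the gradient of any
    # piece that attains the maximum at x.
    return A[np.argmax(A @ x + b)]

def subgradient_method(step_rule, iters=500):
    x = np.array([3.0, 3.0])
    best = f(x)
    for k in range(1, iters + 1):
        g = subgrad(x)
        if step_rule == "constant":
            t = 0.05                      # constant step size t_k = c
        elif step_rule == "constant_length":
            t = 0.05 / np.linalg.norm(g)  # constant step length ||t_k g_k|| = c
        elif step_rule == "diminishing":
            t = 1.0 / k                   # non-summable diminishing rule
        elif step_rule == "sqrt":
            t = 1.0 / np.sqrt(k)          # slower diminishing rule
        else:
            raise ValueError(step_rule)
        x = x - t * g
        # f need not decrease at every iteration, so the best value is tracked.
        best = min(best, f(x))
    return best

for rule in ["constant", "constant_length", "diminishing", "sqrt"]:
    print(rule, round(subgradient_method(rule), 4))
```

Tracking the best value found is standard here, because unlike a descent method the subgradient step can increase the objective at individual iterations; the choice of t_k then governs how quickly the best value approaches the optimum.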
subgradient method; optimization; convex function; p-median
Research subject: Komplexa system - mikrodataanalys (Complex systems - microdata analysis)
Identifiers: URN: urn:nbn:se:du-13186; OAI: oai:DiVA.org:du-13186; DiVA: diva2:658495