Automatic Instance-based Tailoring of Parameter Settings for Metaheuristics
Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Technology and Media.
ORCID iD: 0000-0001-9372-3416
2011 (English) Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

Many industrial problems in various fields, such as logistics, process management, or product design, can be formalized and expressed as optimization problems in order to make them solvable by optimization algorithms. However, solvers that guarantee the finding of optimal solutions (complete solvers) can in practice be unacceptably slow. This is one of the reasons why approximative (incomplete) algorithms, which produce near-optimal solutions under resource restrictions (most prominently time), are of vital importance.

These approximative algorithms go under the umbrella term metaheuristics, each of which is more or less suitable for particular optimization problems. They are flexible solvers that only require a representation for solutions and an evaluation function when searching the solution space for optimality. What all metaheuristics have in common is that their search is guided by certain control parameters. These parameters have to be set manually by the user and are generally problem-dependent and interdependent: a setting producing near-optimal results for one problem is likely to perform worse for another. Automating the parameter setting process in a sophisticated, computationally cheap, and statistically reliable way is challenging and has received a significant amount of attention in the artificial intelligence and operational research communities. This activity has not yet produced any major breakthroughs concerning the utilization of problem instance knowledge or the employment of dynamic algorithm configuration.

The thesis promotes automated parameter optimization with reference to the inverse impact of problem instance diversity on the quality of parameter settings with respect to instance-algorithm pairs. It emphasizes the similarities between static and dynamic algorithm configuration and related problems, showing how they relate to each other. It further proposes two frameworks for instance-based algorithm configuration and evaluates the experimental results. The first is a recommender system for static configurations, combining experimental design and machine learning. The second framework can be used for static or dynamic configuration, taking advantage of the iterative nature of population-based algorithms, a very important sub-class of metaheuristics.

A straightforward implementation of the first framework did not result in the expected improvements, supposedly because of pre-stabilization issues. The second approach shows competitive results when compared to a state-of-the-art model-free configurator, reducing the training time by more than two orders of magnitude.

Place, publisher, year, edition, pages
Östersund: Mid Sweden University, 2011. 62 p.
Series
Mid Sweden University licentiate thesis, ISSN 1652-8948 ; 67
Keyword [en]
Algorithm Configuration, Parameter Tuning, Parameter Control, Metaheuristics
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:miun:diva-14613
ISBN: 978-91-86694-48-7 (print)
OAI: oai:DiVA.org:miun-14613
DiVA: diva2:448377
Presentation
2011-10-14, Q221, Akademigatan 1, Östersund, 22:41 (English)
Opponent
Supervisors
Available from: 2011-10-17 Created: 2011-10-16 Last updated: 2012-08-01. Bibliographically approved
List of papers
1. Recent Development in Automatic Parameter Tuning for Metaheuristics
2010 (English) In: Proceedings of the 19th Annual Conference of Doctoral Students - WDS 2010 / [ed] J. Safrankova and J. Pavlu, 2010, -10 p. Conference paper, Published paper (Refereed)
Abstract [en]

Parameter tuning is an optimization problem with the objective of finding good static parameter settings before the execution of a metaheuristic on a problem at hand. The requirement of tuning multiple control parameters, combined with the stochastic nature of the algorithms, makes parameter tuning a non-trivial problem. To make things worse, one parameter vector allowing the algorithm to solve all optimization problems to the best of its potential is verifiably non-existent, as can be inferred from the no free lunch theorem of optimization. Manual tuning can be conducted, with the drawback of being very time-consuming and failure-prone. Hence, means for automated parameter tuning are required. This paper serves as an overview of recent work within the field of automated parameter tuning.
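The tuning problem described in the abstract can be illustrated with a minimal random-search tuner. This is a hedged sketch, not the paper's method: `run_metaheuristic` is a hypothetical stand-in for launching the real stochastic algorithm, and the `mutation`/`popsize` parameter names are purely illustrative.

```python
import random

# Hypothetical surrogate for a metaheuristic run (a real tuner would launch
# the actual stochastic algorithm); lower return value = better solution.
def run_metaheuristic(instance, params, seed):
    rng = random.Random(seed)
    noise = rng.gauss(0, 0.01)  # stand-in for run-to-run stochasticity
    return sum((params[k] - instance[k]) ** 2 for k in params) + noise

def random_search_tuner(instances, param_space, budget=100, replicates=3):
    """Sample random parameter vectors and keep the one with the best mean
    performance across instances and repeated (seeded) runs."""
    best_params, best_score = None, float("inf")
    for _ in range(budget):
        params = {k: random.uniform(lo, hi) for k, (lo, hi) in param_space.items()}
        # average over instances and seeds to tame the stochastic noise
        score = sum(
            run_metaheuristic(inst, params, seed)
            for inst in instances
            for seed in range(replicates)
        ) / (len(instances) * replicates)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

Averaging each candidate setting over several seeds and instances is one simple way to handle the stochasticity the abstract mentions; more sophisticated tuners spend the evaluation budget adaptively instead.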

Keyword
Parameter tuning, metaheuristics, optimization
National Category
Computer Science
Identifiers
urn:nbn:se:miun:diva-12173 (URN)
Conference
Proceedings of the 19th Annual Conference of Doctoral Students - WDS 2010
Available from: 2010-11-01 Created: 2010-11-01 Last updated: 2011-10-17. Bibliographically approved
2. A Parameter Tuning Framework for Metaheuristics Based on Design of Experiments and Artificial Neural Networks
2010 (English) In: Proceedings of the International Conference on Computer Mathematics and Natural Computing 2010 / [ed] B. Brojack, WASET, 2010. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, a framework for the simplification and standardization of metaheuristic-related parameter tuning by applying a four-phase methodology, utilizing Design of Experiments and Artificial Neural Networks, is presented. Metaheuristics are multi-purpose problem solvers that are utilized on computational optimization problems for which no efficient problem-specific algorithm exists. Their successful application to concrete problems requires the finding of a good initial parameter setting, which is a tedious and time-consuming task. Recent research reveals the lack of a systematic approach when it comes to this so-called parameter tuning process. In the majority of publications, researchers provide only weak motivation for their respective choices, if any. Because initial parameter settings have a significant impact on the solution quality, this course of action could lead to suboptimal experimental results, and thereby a flawed basis for the drawing of conclusions.
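The design-then-model idea behind the framework can be sketched as follows. This is a loose illustration under stated assumptions: a full factorial design stands in for the paper's experimental design, and a 1-nearest-neighbour surrogate stands in for its artificial neural network; all function names are invented for the sketch.

```python
import itertools
import math

# Sketch: evaluate the algorithm on a coarse factorial design, fit a
# surrogate to the measured costs, then query the surrogate on a fine grid
# to recommend a setting. A 1-nearest-neighbour surrogate stands in for the
# artificial neural network used in the paper.
def factorial_design(levels):
    """Cartesian product of per-parameter levels -> list of settings (dicts)."""
    names = sorted(levels)
    return [dict(zip(names, combo))
            for combo in itertools.product(*(levels[n] for n in names))]

def fit_surrogate(observations):
    """observations: list of (setting, measured_cost) pairs."""
    def predict(setting):
        vec = [setting[k] for k in sorted(setting)]
        nearest = min(
            observations,
            key=lambda obs: math.dist([obs[0][k] for k in sorted(obs[0])], vec),
        )
        return nearest[1]  # predicted cost = cost of nearest design point
    return predict

def recommend(evaluate, levels, fine_grid):
    observations = [(s, evaluate(s)) for s in factorial_design(levels)]
    return min(fine_grid, key=fit_surrogate(observations))
```

The point of the surrogate is that the expensive `evaluate` calls happen only on the small design, while the cheap model is queried on the much denser candidate grid.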

Place, publisher, year, edition, pages
WASET, 2010
Keyword
Parameter Tuning, Metaheuristics, Design of Experiments, Artificial Neural Networks
National Category
Other Computer and Information Science
Identifiers
urn:nbn:se:miun:diva-11420 (URN)
Conference
International Conference on Computer Mathematics and Natural Computing
Available from: 2010-08-02 Created: 2010-04-15 Last updated: 2011-10-17. Bibliographically approved
3. An experimental study on robust parameter settings
2010 (English) In: Proceedings of the 12th annual conference companion on Genetic and evolutionary computation, ACM Press, 2010, 1999-2002 p. Conference paper, Published paper (Refereed)
Abstract [en]

That there is no best initial parameter setting for a metaheuristic on all optimization problems is a proven fact (no free lunch theorem). This paper studies the applicability of so-called robust parameter settings for combinatorial optimization problems. Design of Experiments supported parameter screening has been carried out, analyzing a discrete Particle Swarm Optimization algorithm on three demographically very dissimilar instances of the Traveling Salesman Problem. First experimental results indicate that parameter settings produce varying performance quality for the three instances. The robust parameter setting is outperformed in two out of three cases. The results are even significantly worse when considering the quality/time trade-off. A methodology for problem generalization is referred to as a possible solution.

Place, publisher, year, edition, pages
ACM Press, 2010
Keyword
Experimental Design, Metaheuristics, Parameter Tuning
National Category
Computer Science
Identifiers
urn:nbn:se:miun:diva-11892 (URN)
10.1145/1830761.1830844 (DOI)
000322071400073 ()
2-s2.0-77955953052 (Scopus ID)
978-1-4503-0073-5 (ISBN)
Conference
GECCO - Genetic And Evolutionary Computation Conference - 2010
Available from: 2010-08-02 Created: 2010-08-01 Last updated: 2014-08-31. Bibliographically approved
4. Iteration-wise parameter learning
2011 (English) In: 2011 IEEE Congress of Evolutionary Computation, CEC 2011, New Orleans, LA: IEEE conference proceedings, 2011, 455-462 p. Conference paper, Published paper (Refereed)
Abstract [en]

Adjusting the control parameters of population-based algorithms is a means for improving the quality of these algorithms' results when solving optimization problems. The difficulty lies in determining when to assign individual values to specific parameters during the run. This paper investigates the possible implications of a generic and computationally cheap approach towards parameter analysis for population-based algorithms. The effect of parameter settings was analyzed in the application of a genetic algorithm to a set of traveling salesman problem instances. The findings suggest that statistics about local changes of a search from iteration i to iteration i + 1 can provide valuable insight into the sensitivity of the algorithm to parameter values. A simple method for choosing static parameter settings has been shown to recommend settings competitive to those extracted from a state-of-the-art parameter tuner, ParamILS, with major time and setup advantages.

Place, publisher, year, edition, pages
New Orleans, LA: IEEE conference proceedings, 2011
Keyword
Algorithm Configuration, Parameter Tuning, Metaheuristics
National Category
Engineering and Technology
Identifiers
urn:nbn:se:miun:diva-14612 (URN)10.1109/CEC.2011.5949653 (DOI)000312932600063 ()2-s2.0-80052003971 (Scopus ID)978-1-4244-7834-7 (ISBN)
Conference
2011 IEEE Congress of Evolutionary Computation, CEC 2011;New Orleans, LA;5 June 2011through8 June 2011;Code86068
Note

2011 IEEE Congress of Evolutionary Computation, CEC 2011; New Orleans, LA; 5 June 2011 through 8 June 2011; Code 86068

Available from: 2011-10-16 Created: 2011-10-16 Last updated: 2014-08-31Bibliographically approved

Open Access in DiVA

Lic 67(1117 kB)538 downloads
File information
File name FULLTEXT02.pdfFile size 1117 kBChecksum SHA-512
c57243d77b6b672b65f7a114a52a1c4a84f8cef7e390f479bf2561762c5266613eaaef531f1d9190ea0a994e84332bbdd2f5b85dc7471ca432000507b0b2b695
Type fulltextMimetype application/pdf

Search in DiVA

By author/editor
Dobslaw, Felix
By organisation
Department of Information Technology and Media
Engineering and Technology

Search outside of DiVA

GoogleGoogle Scholar
Total: 539 downloads
The number of downloads is the sum of all downloads of full texts. It may include eg previous versions that are now no longer available

isbn
urn-nbn

Altmetric score

isbn
urn-nbn
Total: 5857 hits
CiteExportLink to record
Permanent link

Direct link
Cite
Citation style
  • apa
  • ieee
  • modern-language-association-8th-edition
  • vancouver
  • Other style
More styles
Language
  • de-DE
  • en-GB
  • en-US
  • fi-FI
  • nn-NO
  • nn-NB
  • sv-SE
  • Other locale
More languages
Output format
  • html
  • text
  • asciidoc
  • rtf