Multi-Objective Optimization In Theory and Practice II: Metaheuristic Algorithms

Many-Objective Optimization and Parallel Computation

Author(s): Andre A. Keller

Pp: 197-220 (24)

DOI: 10.2174/9781681087054119010010


Abstract

Large-scale multi-objective problems in industry and engineering may involve much higher-dimensional problems. They may require decomposing the numerous operations performed in the resolution process. In practice, one may be confronted with the optimization of a large number of objectives as well as with a great amount of computation. This chapter is devoted to possible approaches for overcoming these difficulties, namely so-called ‘many-objective optimization’ and ‘parallel computation’. Two major drawbacks in handling more than three (many) objectives are a decreasing selection pressure toward the Pareto front and a decreasing computational efficiency as the number of objectives increases. Addressing these difficulties may consist of adapting or changing the concept of Pareto dominance. A stronger dominance relation allows a better comparison of the quality of solutions. New concepts include ε-dominance, L-optimality, fuzzy dominance, and preference order ranking. Some essential algorithms for many-objective optimization are presented, such as the fast hypervolume-based algorithm, the vector angle-based algorithm, and the reference point-based algorithm, as with NSGA-III. Test problem suites for many-objective optimization are proposed. Parallel search techniques are adapted to new computer architectures with parallel computers and distributed multiprocessor computers. Two motivations for adopting such configurations are saving computation time on complex real-world problems and the possibility of solving large-size problems. The hierarchical master-worker paradigm is a standard way to implement parallel applications. A master process dispatches specific tasks to multiple worker processes and receives the computation results back from them. Parallel computation of metaheuristic algorithms includes parallelization strategies, parallel designs, and parallel metaheuristic algorithms. Applications show how much computation time can be saved with parallel computation.
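As a rough illustration of the relaxed dominance relations mentioned in the abstract, the following minimal Python sketch compares standard Pareto dominance with additive ε-dominance for a minimization problem. The function names and the ε value are illustrative assumptions, not taken from the chapter.

```python
from typing import Sequence

def pareto_dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def epsilon_dominates(a: Sequence[float], b: Sequence[float], eps: float = 0.05) -> bool:
    """True if a additively ε-dominates b (minimization).

    Relaxing each comparison by eps lets one solution cover a whole
    neighborhood of nearby objective vectors, which coarsens the front
    and helps restore selection pressure when many objectives are present.
    """
    return all(x - eps <= y for x, y in zip(a, b))

# Example: under plain Pareto dominance the two vectors are incomparable,
# but with eps = 0.05 the first vector ε-dominates the second.
a, b = (0.40, 0.62), (0.42, 0.60)
print(pareto_dominates(a, b))         # False
print(epsilon_dominates(a, b, 0.05))  # True
```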


Keywords: ε-dominance, Computational efficiency, Hypervolume-based algorithm, Many-objective optimization, Master-worker paradigm, NSGA-III, Parallel computation, Parallelization strategies, Pareto dominance, Reference point-based algorithm, Selection pressure, Vector angle-based algorithm.
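The master-worker paradigm summarized in the abstract can be sketched with Python's standard multiprocessing module: the master process dispatches candidate solutions to a pool of worker processes, which evaluate the objective functions and return the results. The two objective functions below are placeholder assumptions standing in for an expensive real-world evaluation; they are not the chapter's test problems.

```python
import multiprocessing as mp
import random

def evaluate(solution):
    """Worker task: compute the objective vector of one candidate solution.
    In a real application this would be an expensive simulation or model run."""
    f1 = sum(x * x for x in solution)           # placeholder objective 1
    f2 = sum((x - 1.0) ** 2 for x in solution)  # placeholder objective 2
    return f1, f2

def master(population, n_workers=4):
    """Master process: dispatch evaluations to workers and collect the results."""
    with mp.Pool(processes=n_workers) as pool:
        return pool.map(evaluate, population)

if __name__ == "__main__":
    # A small random population of decision vectors (illustrative only).
    population = [[random.random() for _ in range(10)] for _ in range(100)]
    objectives = master(population)
    print(objectives[:3])
```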
