Abrar Yaqoob1*, Navneet Kumar Verma2 and Rabia Musheer Aziz1
1School of Advanced Science and Language, VIT Bhopal University, Kothrikalan, Sehore, India
2State Planning Institute (New Division), Planning Department, Lucknow, Uttar Pradesh, India
Metaheuristic algorithms are heuristic optimization approaches that provide a powerful means of solving difficult optimization problems. They offer an effective way to explore huge solution spaces and identify optimal or near-optimal solutions, are iterative, and are often inspired by natural or social processes. This study provides comprehensive information on metaheuristic algorithms and the many areas in which they are applied. Twenty well-known metaheuristic algorithms, including tabu search, particle swarm optimization, ant colony optimization, genetic algorithms, simulated annealing, and harmony search, are covered in the article. The article extensively explores the applications of these algorithms in diverse domains such as engineering, finance, logistics, and computer science. It underscores particular instances where metaheuristic algorithms have proved useful, such as optimizing structural design, controlling dynamic systems, enhancing manufacturing processes, managing supply chains, and addressing problems in artificial intelligence, data mining, and software engineering. The paper thus provides a thorough insight into the versatile deployment of metaheuristic algorithms across different sectors, highlighting their capacity to tackle complex optimization problems in a wide range of real-world scenarios.
Keywords: Optimization, metaheuristics, machine learning, swarm intelligence
Metaheuristics represent a category of optimization methods widely employed to tackle intricate challenges in domains as diverse as engineering, economics, computer science, and operations research. These adaptable techniques are designed to locate favorable solutions by exploring an extensive array of possibilities while avoiding stagnation in suboptimal outcomes [1]. The roots of metaheuristics can be traced back to the late 1940s, when George Dantzig introduced the simplex method for linear programming [2]. This innovative technique marked a pivotal point in optimization and paved the way for subsequent optimization algorithms. Nonetheless, the simplex method is confined to linear programming problems and does not extend to nonlinear ones. In the 1960s and 1970s, John Holland devised the genetic algorithm, drawing inspiration from the concepts of natural selection and evolution [3]. The genetic algorithm maintains a set of potential solutions and iteratively improves this set through genetic operations such as mutation, crossover, and selection [4]. It was a major milestone in the development of metaheuristics and opened up new possibilities for solving difficult optimization problems. During the 1980s and 1990s, the field of metaheuristics experienced significant expansion and the emergence of numerous novel algorithms, including simulated annealing (SA), tabu search (TS), ant colony optimization (ACO), particle swarm optimization (PSO), and differential evolution (DE). These techniques were created expressly to deal with a variety of optimization problems, drawing inspiration from physical annealing, adaptive memory, swarm intelligence, and evolutionary processes [5].
The prefix "meta-" in metaheuristic indicates a higher level of operation than a simple heuristic, which generally yields better performance. These algorithms balance local search and global exploration, using randomness to produce a diverse range of solutions. Although metaheuristics are widely employed, academic literature offers no single definition of heuristics and metaheuristics, and some authors even use the terms interchangeably. It is now common, however, to classify as metaheuristics all stochastic algorithms that employ randomization and comprehensive exploration of the search space. Metaheuristic algorithms are well suited to global optimization and nonlinear modeling because randomization is a useful mechanism for switching from local to global search. As a result, almost all metaheuristic algorithms can be applied to global nonlinear optimization problems [6]. In recent years, the study of metaheuristics has continued to develop, with new algorithms combining concepts and techniques from fields such as machine learning, deep learning, and data science. The development and evolution of metaheuristics have contributed significantly to solving complex optimization problems and have produced powerful tools for decision-making in various domains [7]. To find solutions in a huge search space, metaheuristic algorithms mimic the behavior of natural or artificial systems. They are particularly valuable for tackling problems that are challenging or impossible to solve using traditional optimization methods. Typically, a metaheuristic algorithm iterates through a series of steps that modify a candidate solution until an acceptable one is found.
Unlike optimization techniques that may become stuck in local optima, metaheuristic algorithms are designed to explore the entire search space. They also exhibit resilience to noise or uncertainty in the optimization problem. Adaptability and flexibility are two of their main features: they can be applied to a wide variety of optimization problems and modified to account for the specific constraints or objectives of the task at hand. However, for complex problems with extensive search spaces, these algorithms may converge slowly toward an optimal solution, and there is no guarantee that they will find the global optimum. Metaheuristic algorithms find extensive application in fields including engineering, finance, logistics, and computer science, where they have been successfully employed to optimize design, control, and manufacturing processes, portfolio selection, and risk management strategies [8].
We shall outline some of the most popular metaheuristic methods in this section.
Genetic algorithms (GAs) belong to a family of metaheuristic optimization techniques that draw inspiration from natural selection and genetics [9-11]. The core idea underlying the GA is to mimic the evolutionary process in order to find the optimal solution to a given problem. Genetic algorithms can address challenges spanning fields as varied as biology, engineering, and finance [12-14]. In a GA, a candidate solution is represented as a chromosome, a collection of genes. Each gene corresponds to one decision variable of the problem, and its value is drawn from the range of values that variable can take [15, 16]. These chromosomes then undergo genetic operations such as mutation and crossover, giving rise to a fresh population of candidate solutions [17-19].
The following are the major steps in the GA:
Initialization: The algorithm begins by creating a population of candidate solutions. Each solution is represented by a chromosome, a string of genes randomly generated from the problem domain [20].
Evaluation: The suitability of each chromosome is assessed based on the objective function of the problem. The quality of the solution is evaluated by the fitness function, and the objective is to optimize the fitness function by either maximizing or minimizing it, depending on the particular problem [21].
Selection: Chromosomes that possess higher fitness values are chosen to form a fresh population of potential solutions. Various techniques, such as roulette wheel selection, tournament selection, and rank-based selection, are employed for the selection process [22].
Crossover: The selected chromosomes are combined through crossover to generate new offspring chromosomes. The crossover operation exchanges the genetic information from the parent chromosomes and is utilized to generate novel solutions [23].
Mutation: The offspring chromosomes are subjected to mutation, which introduces random changes to the genetic information. Mutation aids in preserving diversity within the population and preventing the occurrence of local optima [24].
Replacement: The offspring chromosomes form a new population of candidate solutions, which replaces the less fit members of the previous population.
Termination: The algorithm iterates through the selection, crossover, mutation, and replacement phases until a termination condition is satisfied, such as reaching a predetermined maximum number of iterations or finding a solution of acceptable fitness.
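The steps above can be sketched in a short program. The following is a minimal illustration, not taken from the article: it applies tournament selection, single-point crossover, per-gene mutation, and generational replacement to the toy "OneMax" problem (maximize the number of 1-bits in a bit string); all parameter values are illustrative choices, not recommendations.

```python
import random

POP_SIZE = 30     # chromosomes in the population
GENES = 20        # genes per chromosome (bit-string length)
CROSS_RATE = 0.9  # probability of applying crossover
MUT_RATE = 0.02   # per-gene mutation probability
MAX_GEN = 100     # termination: fixed number of generations

def fitness(chrom):
    # Objective function: count of 1-bits; the global optimum is GENES.
    return sum(chrom)

def tournament(pop, k=3):
    # Tournament selection: return the fittest of k random chromosomes.
    return max(random.sample(pop, k), key=fitness)

def crossover(p1, p2):
    # Single-point crossover: exchange the tails of the two parents.
    if random.random() < CROSS_RATE:
        point = random.randint(1, GENES - 1)
        return p1[:point] + p2[point:], p2[:point] + p1[point:]
    return p1[:], p2[:]

def mutate(chrom):
    # Flip each gene independently with probability MUT_RATE.
    return [1 - g if random.random() < MUT_RATE else g for g in chrom]

def genetic_algorithm():
    # Initialization: random bit-string chromosomes.
    pop = [[random.randint(0, 1) for _ in range(GENES)]
           for _ in range(POP_SIZE)]
    for _ in range(MAX_GEN):
        offspring = []
        while len(offspring) < POP_SIZE:
            c1, c2 = crossover(tournament(pop), tournament(pop))
            offspring += [mutate(c1), mutate(c2)]
        pop = offspring[:POP_SIZE]  # generational replacement
    return max(pop, key=fitness)

best = genetic_algorithm()
print(fitness(best))  # best fitness found; the optimum here is 20
```

Because the fitness landscape of OneMax is smooth, selection pressure alone drives the population to the optimum quickly; on deceptive or multimodal problems, the mutation rate and population size matter far more, which is why they are usually tuned per problem.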