# YPEA: A Toolbox for Evolutionary Algorithms in MATLAB

# Introduction

**Evolutionary Algorithms (EAs)** and **Metaheuristics** are general-purpose tools for dealing with optimization problems, mostly those with a black-box objective function. These algorithms are considered a subfield of Computational Intelligence (CI) and Artificial Intelligence (AI), and they have numerous applications in many fields of science and engineering. Evolutionary algorithms are nature-inspired: they mostly simulate what happens in natural evolution to come up with approximate solutions to optimization problems, just as nature has produced us and other living organisms as solutions to the problem of living on Earth.

In nature, the main objective is to survive, which translates into being more adaptable to the environment. In a highly competitive environment with uncertain conditions, the winner is the one who adapts faster and better. It is possible to see this “Survival of the Fittest” game as an optimization problem, in which the overall purpose of nature and its players is to find the fittest, or best, strategy.

Every optimization problem has at least one objective function, which must be either minimized or maximized. To solve an optimization problem with an evolutionary algorithm, the idea is to define a virtual environment and then create artificial creatures that compete to be the best. If all of the components of natural evolution are added to this environment, evolution will occur and we can perform optimization. Starting from an initial population, it is possible to get better and better solutions generation by generation. That is how an evolutionary algorithm works.

Metaheuristics are a more general family of optimization algorithms that includes evolutionary algorithms. Some of them are inspired by natural processes, while others are metaphorically based on ideas from physics, chemistry, business, society, history, and education.

Many metaheuristics have been proposed by researchers over the last few decades. The general structure of a metaheuristic is as follows:

1. **Initialization:** Create and evaluate an initial set of solutions, maybe just a single solution.
2. **Creating New Solutions:** Based on the current population, and using processes like competition, cooperation, selection, breeding, mutation, etc., some new solutions (a.k.a. offspring) are created. The set of newly created solutions can be combined with the current (parent) solutions in various ways. The result, possibly after a selection and competition process, becomes the new current population.
3. **Checking for Termination:** Check if there is an acceptable solution. If an acceptable solution has been found, execution of the algorithm terminates; otherwise, the algorithm continues from step 2.
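These steps can be sketched as a generic loop. This is illustrative pseudocode only; the helper functions are hypothetical placeholders, not part of any particular toolbox:

```matlab
% Generic metaheuristic skeleton (illustrative; all helper functions
% below are hypothetical placeholders)
pop = initialize_population();              % Step 1: create and evaluate
while ~termination_met(pop)                 % Step 3: check for termination
    offspring = create_new_solutions(pop);  % Step 2: make new solutions
    pop = select_next_population(pop, offspring);
end
```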

Step 2 can differ greatly among algorithms. For example, in the Genetic Algorithm (GA), a well-known evolutionary algorithm inspired by natural evolution, this step contains these operations:

- Selecting parents based on their fitness values
- Performing crossover (similar to reproduction)
- Performing mutation (similar to natural mutation)
- Combining the populations of parents and offspring
- Performing competition among members of the combined population
- Selecting members of the new generation
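To make the crossover step concrete, a single-point crossover on two parent vectors can be sketched as follows (an illustrative example, not YPEA's internal implementation):

```matlab
% Single-point crossover (illustrative): each child takes the head of
% one parent and the tail of the other
p1 = [1 1 1 1 1 1];
p2 = [0 0 0 0 0 0];
c = randi(numel(p1) - 1);           % random crossover point in 1..5
child1 = [p1(1:c), p2(c+1:end)];    % e.g. c = 2 gives [1 1 0 0 0 0]
child2 = [p2(1:c), p1(c+1:end)];    % e.g. c = 2 gives [0 0 1 1 1 1]
```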

As another example, in Particle Swarm Optimization (PSO), a Swarm Intelligence algorithm inspired by the collective and social behavior of animals like birds and fish, step 2 consists of these operations:

- Every particle, i.e. every candidate solution in the current population, moves towards (a) the best solution ever found, (b) its own best memory, and (c) its previous moving direction.
- The new position is evaluated.
- The best memory of the particle (aka. personal best) is updated.
- The best collective memory of the whole swarm (aka. global best) is updated.
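The movement in the first step is commonly written as a velocity and position update for particle $i$, where $w$ is the inertia weight, $c_1$ and $c_2$ are acceleration coefficients, $r_1$ and $r_2$ are random numbers in $[0, 1]$, $p_i$ is the personal best, and $g$ is the global best:

$$v_i \leftarrow w\, v_i + c_1 r_1 (p_i - x_i) + c_2 r_2 (g - x_i)$$

$$x_i \leftarrow x_i + v_i$$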

# Yarpiz Evolutionary Algorithms Toolbox

Within the last few years, I have shared many pieces of code on Yarpiz, implementing various AI, Machine Learning, and Computational Intelligence methods, algorithms, and applications. Most of the projects are developed in MATLAB, and they are also available for download on MATLAB File Exchange (FEX).

However, there are some frequently asked questions about applying metaheuristics and implementing them in research projects or real-world problems. In fact, most students and researchers have difficulty defining their own problems, models, and objective functions. In many cases, we must simulate a system to calculate the key metrics and objectives for various sets of decision variables during the optimization process. This task is not straightforward and requires previous experience with implementing such projects.

A key part of the implementation is to define the decision variables of an optimization problem. In real-world problems, we usually deal with several unknown decision variables of various sizes and types. For example, a problem can have these types of variables:

- 10 binary variables,
- 8 real-valued variables in the range [-5, 5],
- 2 integer variables from set {3, 4, 5, 6},
- a 3-by-5 matrix of real-valued variables in range [0, 2],
- 5 permutations of length 20,
- a segmentation of 30 elements into 7 disjoint subsets, and
- selection of 6 out of 20 members of a set.

To create a unified approach for implementing metaheuristics and solving optimization problems with them, I developed the Yarpiz Evolutionary Algorithms Toolbox for MATLAB, or YPEA for short. This toolbox helps researchers and developers in these ways:

- It enables the user to define various types of decision variables just by describing what they are. The operations for encoding and decoding variables are performed automatically by the toolbox, behind the scenes.
- After the variables are defined, the user can implement the objective function using real-world variables. This makes the implementation of the objective function much easier.
- It implements several metaheuristic algorithms to the same standard, which makes it easy to apply multiple algorithms to an optimization problem without modifying the code. The problem is defined once, but it can be solved multiple times by different optimizers.

## List of Algorithms

Currently, 15 metaheuristics and evolutionary algorithms are implemented in the YPEA toolbox. The list of implemented algorithms, with a short description of each, follows. The list is ordered alphabetically by the full name of the algorithms.

**Artificial Bee Colony (ABC)**

This algorithm is inspired by the collective and intelligent foraging behavior of honey bees. It was proposed by Derviş Karaboğa in 2005.

**Ant Colony Optimization for Continuous Domains (ACOR)**

Ant Colony Optimization (ACO), proposed by Marco Dorigo in 1992, is mainly used for discrete and combinatorial optimization problems. Socha and Dorigo proposed the continuous version in 2008, which uses a probabilistic model, built from an archive of good solutions, to sample new candidate solutions.

**Bees Algorithm (BA)**

This is another algorithm inspired by the intelligent foraging behavior of honey bees. It was proposed by Duc Truong Pham et al. in 2005.

**Biogeography-based Optimization (BBO)**

Originally developed by Dan Simon in 2008, this algorithm is based on biogeography and the distribution of species across habitats. BBO simulates the processes of immigration and emigration between territories.

**Covariance Matrix Adaptation Evolution Strategy (CMA-ES)**

Evolution Strategy (ES) is a well-known evolutionary algorithm whose development dates back to the early 1960s. Many variations of ES have been proposed since then. CMA-ES is a probabilistic, enhanced version of ES, proposed by Nikolaus Hansen in the 2000s.

**Cultural Algorithm (CA)**

Proposed by Robert G. Reynolds in 1994, cultural algorithms use the concepts of culture, knowledge, and a belief space, in addition to individuals, to perform evolution. The belief space models the common experience of the whole population. The Cultural Algorithm can be considered an instance of Social Darwinism.

**Differential Evolution (DE)**

This algorithm uses the differences between individuals in the population to create new candidate solutions. There are various schemes for creating new solutions, so DE has many versions and modifications. Differential Evolution was proposed by Rainer Storn and Kenneth Price in 1995.

**Firefly Algorithm (FA)**

This algorithm was proposed and developed by Xin-She Yang in 2009, inspired by fireflies and their flashing behavior.

**Genetic Algorithm (GA)**

The simulation of natural evolution started in the early 1950s. At that time, three separate groups were working on the idea of creating an intelligent, general-purpose tool to solve all kinds of problems, especially optimization problems. The first approach resulted in the development of the Evolution Strategy (ES), the second led to the introduction of Evolutionary Programming (EP), and the third ended up as GA. John Holland studied genetic algorithms, and through his work, GAs became popular in the early 1970s.

**Harmony Search (HS)**

This algorithm is inspired by the process of music composition and the way musicians search for better harmonies. Harmony Search was proposed by Zong Woo Geem et al. in 2001.

**Imperialist Competitive Algorithm (ICA)**

This is a socio-political metaheuristic, inspired by the historical colonization process and the competition among imperialists to capture more and more colonies. ICA was developed and proposed by Esmaeil Atashpaz Gargari and Caro Lucas in 2007.

**Invasive Weed Optimization (IWO)**

This algorithm was proposed by Alireza Mehrabian and Caro Lucas in 2006. IWO imitates the spreading strategy of weeds, which can be described using r/K selection theory.

**Particle Swarm Optimization (PSO)**

PSO is inspired by the social behavior of animal groups, like schools of fish and flocks of birds. James Kennedy and Russell C. Eberhart originally intended to simulate the social behavior of animals, but ended up proposing a new optimization algorithm in 1995, now known as PSO.

**Simulated Annealing (SA)**

While this algorithm was developed by several independent groups of researchers, Scott Kirkpatrick et al. were the first to use SA to solve the Travelling Salesman Problem (TSP), in 1983. Simulated Annealing is inspired by the annealing process in metallurgy, a heat treatment that makes materials, especially metals, more ductile and workable.

**Teaching-Learning-based Optimization (TLBO)**

TLBO simulates the process of teaching, learning, and passing on knowledge in a classroom. The main source of learning is the teacher; however, students can also learn by interacting with each other. This algorithm was proposed by Ravipudi Venkata Rao et al. in 2011.

## List of Supported Variable Types

One of the difficulties in modeling and defining optimization problems is dealing with different kinds of decision variables. YPEA supports several types of decision variables, which makes problem definition much easier.

The supported variable types in YPEA are listed below:

- **Real Variables**, used to model real-valued (continuous) variables of any size (scalar, vector, matrix, etc.), with any combination of lower and upper bounds. *Example:* most real-world mathematical programming and optimization problems.
- **Integer Variables**, used to model integer-valued (discrete) variables of any size (scalar, vector, matrix, etc.), with any combination of lower and upper bounds. *Example:* the Generalized Knapsack Problem.
- **Binary Variables**, special cases of integer variables with the lower bound set to 0 and the upper bound set to 1. *Example:* the standard (binary) Knapsack Problem.
- **Permutation Variables**, used to model permutations of some items; a variable can include more than one permutation (as rows of a matrix). *Example:* the Traveling Salesman Problem (TSP), the n-Queens Problem, and the Quadratic Assignment Problem (QAP).
- **Selection Variables**, used to model fixed-length selections of some items; a variable can include more than one selection (as rows of a matrix). *Example:* feature selection.
- **Partition/Allocation Variables**, used to partition a list of items into disjoint subsets (also known as allocation or assignment); a variable can include more than one partitioning (as rows of matrices). *Example:* the Vehicle Routing Problem (VRP) and hub location-allocation.
- **Fixed Sum Variables**, used to model real-valued variables that must have some fixed target sum (over the whole matrix, its rows, or its columns). *Example:* the Transportation Problem and supply chain management.

All of these variable types are encoded as real values in the range [0, 1]. Hence, all of the algorithms implemented in YPEA can be used interchangeably to solve any defined problem.
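As an example of such an encoding, the random-key method (used later in this article to encode permutations) recovers a permutation from a vector of real keys in [0, 1] by sorting them. This is an illustrative sketch, not YPEA's internal code:

```matlab
% Random-key decoding: the ordering of the keys defines a permutation
keys = [0.62 0.13 0.88 0.41 0.27];
[~, perm] = sort(keys);   % perm = [2 5 4 1 3]
```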

## Download and Installation

The source code of the YPEA toolbox is shared on GitHub, via this URL [+]. If you want to install the toolbox directly, you can download the installable MATLAB toolbox file (*.mltbx) from this link [+].

The toolbox is also available for download on MATLAB File Exchange, via this link [+]. Alternatively, you can search for “YPEA Toolbox” in MATLAB's Add-On Explorer to find and install it directly.

The toolbox comes with complete documentation, accessible via MATLAB help. You can also use the following command to see the documentation for the YPEA toolbox:

`doc ypea`

# How Does YPEA Work?

The main steps to set up your problem and solve it using YPEA are listed below:

**1. Define Optimization Problem**

- Initialize an instance of the optimization problem
- Set the type of optimization problem (either minimization or maximization)
- Define search space and decision variables
- Define the objective function
- Note: In the case of constrained optimization problems, constraints are defined and then incorporated into the final objective function using methods like penalty functions.

**2. Use Optimization Algorithm**

- Create an instance of the optimization algorithm class
- Set the parameters of the algorithm
- Call the `solve` method of the algorithm, passing the problem object into it
- Get the optimization results and visualize them (if needed)

# Solving a Classic Problem

As a simple problem, we are going to optimize the well-known ‘Sphere’ function, which is defined as follows:
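The Sphere function, in $n$ dimensions, is the sum of squares of the decision variables:

$$f(x) = \sum_{i=1}^{n} x_i^2$$

Its global minimum is $f(x^*) = 0$ at $x^* = (0, \ldots, 0)$.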

This function can be implemented in MATLAB using the following code:
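A minimal implementation, for example as an anonymous function:

```matlab
% Sphere function: sum of squared decision variables
sphere = @(x) sum(x.^2);

sphere([1 2 3])   % returns 14
```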

However, YPEA comes with definitions of some benchmark functions, including Sphere, Ackley, Rastrigin, and Rosenbrock. You can get a handle to any of these functions by calling `ypea_test_function`, as follows:
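A sketch of such a call; the exact argument format of `ypea_test_function` is an assumption here (check `doc ypea` for the details):

```matlab
% Get a handle to the built-in Sphere benchmark function
% (the string argument 'sphere' is assumed)
f = ypea_test_function('sphere');

% The handle can then be evaluated like any MATLAB function
z = f([1 2 3]);
```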

We are going to minimize the Sphere function in a search space of 20 decision variables (i.e. unknown variables), each in the range [-10, 10]. In the next sections, we will define the problem structure and then pass it to some of the algorithms available in the YPEA toolbox.

## Problem Definition

Now we will define the problem structure, specifying the type of problem, the search space, and the objective function. So we have this code to define the problem structure:
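A sketch of that definition, assuming the toolbox exposes a `ypea_problem` class and a `ypea_var` helper for describing decision variables (the names follow the toolbox's `ypea_` prefix convention, but the exact API is an assumption; see the toolbox documentation):

```matlab
% Create the problem object and set it to minimization
% (ypea_problem, ypea_var, and the property names below are assumed)
problem = ypea_problem();
problem.type = 'min';

% Search space: 20 real-valued variables, each in [-10, 10]
problem.vars = ypea_var('x', 'real', 'size', 20, ...
                        'lower_bound', -10, 'upper_bound', 10);

% Objective function: receives a solution structure 'sol' and reads
% the decoded decision variables from its field 'x'
sphere = @(x) sum(x.^2);
problem.obj_func = @(sol) sphere(sol.x);
```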

In general, the objective function accepts a solution structure, denoted by `sol` in the code, and this structure contains the actual values of the decision variables in its field `x`.

Now, we are ready to use an algorithm to solve the defined optimization problem. We will use some of the solvers available in the YPEA toolbox.

## Using Genetic Algorithm

The class `ypea_ga` in the toolbox implements the Real-Coded Genetic Algorithm. The code below shows how we can use GA to solve our optimization problem. First, we need to create an instance of `ypea_ga`, and then we will use it to get the problem solved.
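A sketch of that setup; the parameter names `max_iter` and `pop_size` are assumptions (a population size of 100 would be consistent with the NFE increments of 100 per iteration in the log below):

```matlab
% Create an instance of the real-coded Genetic Algorithm
ga = ypea_ga();

% Configure the run (property names are assumed)
ga.max_iter = 1000;   % number of iterations
ga.pop_size = 100;    % population size
```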

Now we can pass the problem structure to the GA instance to find the solution:
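The run itself uses the `solve` method, whose output is the `best_sol` structure examined below:

```matlab
% Pass the problem to the GA and retrieve the best solution found
best_sol = ga.solve(problem);
```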

After running this code, the following lines are printed to the Command Window in MATLAB, one line per iteration of the algorithm.

```
Genetic Algorithm started ...
Initializing population.
Iteration 1: Best Obj. Value = 329.689, NFE = 200
Iteration 2: Best Obj. Value = 313.7412, NFE = 300
Iteration 3: Best Obj. Value = 240.6908, NFE = 400
Iteration 4: Best Obj. Value = 232.9948, NFE = 500
Iteration 5: Best Obj. Value = 152.2398, NFE = 600
...
Iteration 995: Best Obj. Value = 1.5244e-09, NFE = 99600
Iteration 996: Best Obj. Value = 1.5244e-09, NFE = 99700
Iteration 997: Best Obj. Value = 1.5244e-09, NFE = 99800
Iteration 998: Best Obj. Value = 1.5243e-09, NFE = 99900
Iteration 999: Best Obj. Value = 1.5243e-09, NFE = 100000
Iteration 1000: Best Obj. Value = 1.5243e-09, NFE = 100100
End of optimization.
```

If you want to suppress the output, simply set the `display` property of the `ga` object to `false`.

The final actual solution is stored in the `solution` field of the `best_sol` structure, which is the output of the `solve` method. It looks something like this:

```
>> best_sol.solution

ans =

  struct with fields:

    x: [1×20 double]
```

Inside this structure, there is a field `x`, and its value is as follows:

```
>> best_sol.solution.x

ans =

    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
```

and the related objective value is:

```
>> best_sol.obj_value

ans = 0
```

The total run time of the algorithm (in seconds) is given by:

```
>> ga.run_time

ans = 8.9737
```

and the total number of function evaluations is stored in the `nfe` field of the algorithm object:

```
>> ga.nfe

ans = 100100
```

We can illustrate the result of the optimization process by plotting the best objective value history (`ga.best_obj_value_history`) vs. the number of function evaluations (`ga.nfe_history`). A sample output plot, which is the output of the following code, is shown below.
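A sketch of that plotting code, using the two history properties named above (standard MATLAB plotting on a logarithmic vertical axis, since the objective value spans many orders of magnitude):

```matlab
% Convergence curve: best objective value vs. function evaluations
semilogy(ga.nfe_history, ga.best_obj_value_history, 'LineWidth', 2);
xlabel('Number of Function Evaluations (NFE)');
ylabel('Best Objective Value');
grid on;
```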

The plot shows how the best solution found by GA converged to the final optimum of the problem. After about 85,000 function evaluations, the objective value of the best solution is almost zero, very close to the actual optimum of the problem.

## Using Particle Swarm Optimization

Once a problem is defined in the YPEA format, it is possible to solve it using any other algorithm available in the toolbox. For example, instead of GA, we can use PSO to find the global minimum of the Sphere function, as follows. A sample of the output plot is shown after the code.
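A sketch of the PSO run; the class name `ypea_pso` and the parameter names are assumptions based on the naming convention of `ypea_ga`:

```matlab
% Create and configure a PSO instance (names below are assumed)
pso = ypea_pso();
pso.max_iter = 1000;
pso.pop_size = 100;

% Solve the same problem definition, unchanged
best_sol = pso.solve(problem);

% Plot the convergence curve
semilogy(pso.nfe_history, pso.best_obj_value_history, 'LineWidth', 2);
xlabel('Number of Function Evaluations (NFE)');
ylabel('Best Objective Value');
grid on;
```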

According to the results obtained and the plot, PSO has found the solution after about 50,000 function evaluations.

## Using Differential Evolution

Similarly, we can use Differential Evolution to solve the problem we have defined above. According to the conventions found in the literature of DE, we are going to use the **DE/best/2/exp** version, which means:

- the base vector of mutation is the best solution ever found,
- two vector differences are used (i.e. four random solutions contribute to the creation of the mutated vector), and
- exponential crossover is utilized.

Some other available configurations are listed below:

- DE/rand/1/bin *(default configuration in YPEA)*
- DE/rand-to-best/1/exp
- DE/target-to-best/3/bin
- DE/best/1/bin
- DE/rand/5/exp

Just like GA and PSO, we can use the following code to minimize the Sphere function using DE. A sample output plot is also shown after the code. According to the plot, DE has found the solution after about 25,000 function evaluations.
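A sketch of the DE run; the class name `ypea_de` and passing the scheme string to the constructor are assumptions based on the toolbox's conventions:

```matlab
% Create a DE instance using the DE/best/2/exp scheme
% (constructor argument format is assumed)
de = ypea_de('best/2/exp');
de.max_iter = 1000;
de.pop_size = 100;

% Solve the same problem definition and plot the convergence curve
best_sol = de.solve(problem);
semilogy(de.nfe_history, de.best_obj_value_history, 'LineWidth', 2);
xlabel('Number of Function Evaluations (NFE)');
ylabel('Best Objective Value');
grid on;
```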

# Conclusion

The YPEA toolbox provides a unified approach to defining optimization problems and solving them using various metaheuristics and evolutionary algorithms. This makes research and the development of new applications in the field of optimization much easier. Also, once a problem is defined, it can be solved with several algorithms without any modifications.

YPEA comes with several encoding/decoding and representation schemes, which make it very useful for dealing with various kinds of optimization problems and decision variables. So this toolbox can be applied to many real-world problems, including, but not limited to, Combinatorial Optimization, Operational Research (OR), scheduling problems, the Quadratic Assignment Problem (QAP), Wireless Sensor Networks (WSN), transportation, Supply Chain Management (SCM), and the Vehicle Routing Problem (VRP). For example, in the documentation of the toolbox, the well-known Traveling Salesman Problem (TSP) is solved using a real-coded Simulated Annealing (SA), utilizing a random-key method implemented in the toolbox to encode permutations in the domain of real numbers.

# Citing YPEA

The YPEA Toolbox for MATLAB is open-source and free to use. If you are going to use the toolbox in your research projects and want to cite it, please use the following format.

**Mostapha Kalami Heris, Yarpiz Evolutionary Algorithms Toolbox for MATLAB (YPEA), Yarpiz, 2020.**