Solving the Rastrigin Function with a Genetic Algorithm in Python
This article introduces the multimodal Rastrigin function, explains its challenges for optimization, and demonstrates a complete Python implementation of a genetic algorithm—including encoding, selection, crossover, mutation, and decoding—to locate the function’s global minimum, with visual results and performance analysis.
Rastrigin Function
The Rastrigin function is a classic nonlinear, multimodal benchmark: its cosine terms create a regular grid of local minima, which makes finding the global minimum at (0, 0) (where the function value is 0) difficult. It is widely used to test optimization algorithms.
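Before running the full algorithm, the two-dimensional Rastrigin function itself can be evaluated directly. This minimal sketch (the helper name `rastrigin` is ours) confirms the global minimum at the origin and the value at the nearby integer point (1, 1), one of the many local basins:

```python
import numpy as np

def rastrigin(x, y, a=10):
    # f(x, y) = 2a + x^2 - a*cos(2*pi*x) + y^2 - a*cos(2*pi*y)
    return 2 * a + x ** 2 - a * np.cos(2 * np.pi * x) + y ** 2 - a * np.cos(2 * np.pi * y)

print(rastrigin(0.0, 0.0))  # global minimum: 0.0
print(rastrigin(1.0, 1.0))  # value at the integer point (1, 1): 2.0
```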
Function Plot
Genetic algorithm for finding the Rastrigin minimum
<code>import numpy as np
import matplotlib.pyplot as plt


def fitness_func(X):
    # Objective (fitness) function; X is the decoded phenotype of the population
    a = 10
    pi = np.pi
    x = X[:, 0]
    y = X[:, 1]
    return 2 * a + x ** 2 - a * np.cos(2 * pi * x) + y ** 2 - a * np.cos(2 * pi * y)


def decode(x, a, b):
    """Decode a binary genotype into a real-valued phenotype in [a, b]"""
    xt = 0
    for i in range(len(x)):
        xt = xt + x[i] * np.power(2, i)
    return a + xt * (b - a) / (np.power(2, len(x)) - 1)


def decode_X(X: np.ndarray):
    """Decode the whole population; decode() handles a single 20-bit variable"""
    X2 = np.zeros((X.shape[0], 2))
    for i in range(X.shape[0]):
        xi = decode(X[i, :20], -5, 5)  # first 20 bits encode x
        yi = decode(X[i, 20:], -5, 5)  # last 20 bits encode y
        X2[i, :] = np.array([xi, yi])
    return X2


def select(X, fitness):
    """Roulette-wheel selection; smaller objective values get higher selection probability"""
    fitness = 1 / fitness
    fitness = fitness / fitness.sum()
    idx = np.array(list(range(X.shape[0])))
    X2_idx = np.random.choice(idx, size=X.shape[0], p=fitness)
    X2 = X[X2_idx, :]
    return X2


def crossover(X, c):
    """Uniform crossover: each gene of an adjacent pair is swapped with probability c"""
    for i in range(0, X.shape[0], 2):
        xa = X[i, :]
        xb = X[i + 1, :]
        for j in range(X.shape[1]):
            if np.random.rand() <= c:
                xa[j], xb[j] = xb[j], xa[j]
        X[i, :] = xa
        X[i + 1, :] = xb
    return X


def mutation(X, m):
    """Bit-flip mutation: each gene flips with probability m"""
    for i in range(X.shape[0]):
        for j in range(X.shape[1]):
            if np.random.rand() <= m:
                X[i, j] = (X[i, j] + 1) % 2
    return X


def ga():
    """Main genetic algorithm loop"""
    c = 0.3    # crossover probability
    m = 0.05   # mutation probability
    best_fitness = []
    best_xy = []
    iter_num = 100
    X0 = np.random.randint(0, 2, (50, 40))  # 50 individuals, 40 bits each
    for i in range(iter_num):
        X1 = decode_X(X0)
        fitness = fitness_func(X1)
        X2 = select(X0, fitness)
        X3 = crossover(X2, c)
        X4 = mutation(X3, m)
        # Evaluate the new generation and record its best individual
        X5 = decode_X(X4)
        fitness = fitness_func(X5)
        best_fitness.append(fitness.min())
        x, y = X5[fitness.argmin()]
        best_xy.append((x, y))
        X0 = X4
    print("Best value: %.5f" % best_fitness[-1])
    print("Best solution: x=%.5f, y=%.5f" % best_xy[-1])
    plt.plot(best_fitness, color='r')
    plt.show()


if __name__ == "__main__":
    ga()
</code>Final result: best value 1.01469 at x = -0.00696, y = 0.98867.
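As a sanity check on the 20-bit encoding used above, a minimal sketch (restating the decode logic in plain Python for illustration) shows that an all-zeros chromosome maps to the lower bound, an all-ones chromosome to the upper bound, and that the encoding's resolution on [-5, 5] is (b - a)/(2^20 - 1):

```python
import numpy as np

def decode(bits, a, b):
    # little-endian binary -> integer -> real value in [a, b]
    xt = sum(int(bit) * 2 ** i for i, bit in enumerate(bits))
    return a + xt * (b - a) / (2 ** len(bits) - 1)

print(decode(np.zeros(20, dtype=int), -5, 5))  # lower bound: -5.0
print(decode(np.ones(20, dtype=int), -5, 5))   # upper bound: 5.0
print((5 - (-5)) / (2 ** 20 - 1))              # step size, roughly 9.5e-06
```

This resolution explains why the reported best solution lands close to, but not exactly on, the true optimum: the search is over a finite grid of 2^20 values per coordinate, and convergence also depends on the stochastic operators.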
Change of the optimal solution as the number of iterations increases
Reference
《Python最优化算法实战》 (Python Optimization Algorithms in Practice), Su Zhenyu
Model Perspective
Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".