Differential Evolution (DE) is a population-based global optimization algorithm proposed by Storn and Price in 1997. By mimicking the mutation, crossover, and selection operations of biological evolution, it iteratively improves a population of candidate solutions to an objective function. Because it is simple, efficient, and robust, DE is widely applied to single-objective, multi-objective, and constrained optimization problems.

This article explains how DE works and demonstrates it on three concrete problems. Each case includes a complete Python implementation and the code to plot its optimization curve.
DE maintains a population of candidate solutions, uses the differences between individuals to generate new candidates, and keeps the better of each parent/child pair. The main steps are:

Initialization: sample the population uniformly at random within the variable bounds.
Mutation: for each target vector, pick three distinct individuals x_a, x_b, x_c and form the mutant v = x_a + F * (x_b - x_c), where F is the scale factor.
Crossover: mix the mutant and the target gene by gene with probability CR to obtain a trial vector, forcing at least one gene to come from the mutant.
Selection: evaluate the trial vector; if it is better than the target, it replaces the target in the next generation.
Iteration: repeat mutation, crossover, and selection until the maximum number of generations is reached.
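The mutation and crossover operators described above can be illustrated on small toy vectors (the numbers below are made up purely for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
F, CR = 0.5, 0.7

# Three distinct individuals drawn from the population (toy values)
x_a = np.array([1.0, 2.0, 3.0])
x_b = np.array([0.5, 1.5, 2.5])
x_c = np.array([2.0, 0.0, 1.0])
target = np.array([0.0, 0.0, 0.0])

# Mutation (DE/rand/1): v = x_a + F * (x_b - x_c)
mutant = x_a + F * (x_b - x_c)
print(mutant)  # [0.25 2.75 3.75]

# Binomial crossover: take each gene from the mutant with probability CR,
# forcing at least one gene to come from the mutant
mask = rng.random(3) <= CR
mask[rng.integers(3)] = True
trial = np.where(mask, mutant, target)
print(trial)
```

The trial vector depends on the random mask, so its exact value varies; the mutant is deterministic given the three parents.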
Solving a single-objective optimization problem (the 10-dimensional sphere function):

f(x) = \sum_{i=1}^{D} x_i^2, \quad x_i \in [-5.12, 5.12]
```python
import numpy as np
import matplotlib.pyplot as plt

class DifferentialEvolution:
    def __init__(self, objective_func, bounds, population_size=50,
                 max_generations=100, F=0.5, CR=0.7):
        self.objective_func = objective_func
        self.bounds = bounds
        self.population_size = population_size
        self.max_generations = max_generations
        self.F = F    # mutation scale factor
        self.CR = CR  # crossover rate
        self.dimensions = len(bounds)
        self.population = None
        self.fitness = None

    def initialize_population(self):
        self.population = np.random.uniform(
            low=[b[0] for b in self.bounds],
            high=[b[1] for b in self.bounds],
            size=(self.population_size, self.dimensions)
        )
        self.fitness = np.array([self.objective_func(ind) for ind in self.population])

    def mutate(self, target_idx):
        # DE/rand/1: pick three mutually distinct individuals, none of them the target
        candidates = [i for i in range(self.population_size) if i != target_idx]
        a, b, c = np.random.choice(candidates, 3, replace=False)
        mutant = self.population[a] + self.F * (self.population[b] - self.population[c])
        # Clip the mutant back into the search bounds
        return np.clip(mutant, [lo for lo, hi in self.bounds], [hi for lo, hi in self.bounds])

    def crossover(self, target_idx, mutant):
        trial = np.copy(self.population[target_idx])
        cross_points = np.random.rand(self.dimensions) <= self.CR
        # Force at least one gene to come from the mutant
        cross_points[np.random.randint(self.dimensions)] = True
        trial[cross_points] = mutant[cross_points]
        return trial

    def select(self, target_idx, trial):
        # Greedy selection: keep the trial only if it improves on the target
        trial_fitness = self.objective_func(trial)
        if trial_fitness < self.fitness[target_idx]:
            self.population[target_idx] = trial
            self.fitness[target_idx] = trial_fitness

    def run(self):
        self.initialize_population()
        best_fitness_history = []
        for generation in range(self.max_generations):
            for i in range(self.population_size):
                mutant = self.mutate(i)
                trial = self.crossover(i, mutant)
                self.select(i, trial)
            best_fitness = np.min(self.fitness)
            best_fitness_history.append(best_fitness)
            print(f"Generation {generation + 1}: Best Fitness = {best_fitness}")
        return best_fitness_history

# Objective: 10-dimensional sphere function
def objective_function(x):
    return np.sum(x**2)

# Variable bounds
bounds = [(-5.12, 5.12)] * 10

# Run the algorithm
de = DifferentialEvolution(objective_function, bounds, population_size=50, max_generations=100)
best_fitness_history = de.run()

# Plot the optimization curve
plt.plot(best_fitness_history)
plt.title('DE Optimization Curve for Single-Objective Problem')
plt.xlabel('Generation')
plt.ylabel('Best Fitness')
plt.show()
```
Sample output (abridged; exact values vary between runs because the algorithm is stochastic):

```
Generation 1: Best Fitness = 37.75832295895839
Generation 2: Best Fitness = 37.75832295895839
Generation 3: Best Fitness = 37.665206407771855
Generation 4: Best Fitness = 27.868288924606585
Generation 5: Best Fitness = 26.32768920670396
...
Generation 98: Best Fitness = 0.00021517569214993674
Generation 99: Best Fitness = 0.00021517569214993674
Generation 100: Best Fitness = 0.00021517569214993674
```

The best fitness falls from roughly 37.8 to about 2.2e-4 over 100 generations, approaching the global minimum f(0) = 0.
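As a sanity check, the same sphere problem can be handed to SciPy's built-in DE implementation, `scipy.optimize.differential_evolution` (this assumes SciPy is installed; it is not used elsewhere in this article). Its `mutation` and `recombination` arguments correspond to F and CR in the notation above:

```python
import numpy as np
from scipy.optimize import differential_evolution

def sphere(x):
    return np.sum(x**2)

bounds = [(-5.12, 5.12)] * 10

# mutation corresponds to F, recombination to CR
result = differential_evolution(sphere, bounds, mutation=0.5,
                                recombination=0.7, seed=1)
print(result.fun)  # essentially 0
```

Because SciPy polishes the best solution with a local optimizer by default, the reported minimum is driven essentially to zero.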
Solving a multi-objective optimization problem (minimize both objectives simultaneously):

f_1(x) = \sum_{i=1}^{D} x_i^2, \quad f_2(x) = \sum_{i=1}^{D} (x_i - 2)^2
```python
class DifferentialEvolutionMO:
    def __init__(self, objective_funcs, bounds, population_size=50,
                 max_generations=100, F=0.5, CR=0.7):
        self.objective_funcs = objective_funcs
        self.bounds = bounds
        self.population_size = population_size
        self.max_generations = max_generations
        self.F = F    # mutation scale factor
        self.CR = CR  # crossover rate
        self.dimensions = len(bounds)
        self.population = None
        self.fitness = None

    def initialize_population(self):
        self.population = np.random.uniform(
            low=[b[0] for b in self.bounds],
            high=[b[1] for b in self.bounds],
            size=(self.population_size, self.dimensions)
        )
        self.fitness = np.array([self.evaluate_fitness(ind) for ind in self.population])

    def evaluate_fitness(self, individual):
        # One value per objective function
        return np.array([func(individual) for func in self.objective_funcs])

    def dominates(self, a, b):
        # a Pareto-dominates b: no worse in every objective, strictly better in at least one
        return np.all(a <= b) and np.any(a < b)

    def mutate(self, target_idx):
        # DE/rand/1: three mutually distinct individuals, none of them the target
        candidates = [i for i in range(self.population_size) if i != target_idx]
        a, b, c = np.random.choice(candidates, 3, replace=False)
        mutant = self.population[a] + self.F * (self.population[b] - self.population[c])
        # Clip the mutant back into the search bounds
        return np.clip(mutant, [lo for lo, hi in self.bounds], [hi for lo, hi in self.bounds])

    def crossover(self, target_idx, mutant):
        trial = np.copy(self.population[target_idx])
        cross_points = np.random.rand(self.dimensions) <= self.CR
        # Force at least one gene to come from the mutant
        cross_points[np.random.randint(self.dimensions)] = True
        trial[cross_points] = mutant[cross_points]
        return trial

    def select(self, target_idx, trial):
        # Replace the target only if the trial Pareto-dominates it
        trial_fitness = self.evaluate_fitness(trial)
        if self.dominates(trial_fitness, self.fitness[target_idx]):
            self.population[target_idx] = trial
            self.fitness[target_idx] = trial_fitness

    def run(self):
        self.initialize_population()
        best_fitness_history = []
        for generation in range(self.max_generations):
            for i in range(self.population_size):
                mutant = self.mutate(i)
                trial = self.crossover(i, mutant)
                self.select(i, trial)
            # Component-wise minimum over the population: one value per objective
            best_fitness = np.min(self.fitness, axis=0)
            best_fitness_history.append(best_fitness)
            print(f"Generation {generation + 1}: Best Fitness = {best_fitness}")
        return best_fitness_history

# Objective functions
def objective_function_1(x):
    return np.sum(x**2)

def objective_function_2(x):
    return np.sum((x - 2)**2)

# Variable bounds
bounds = [(-5.12, 5.12)] * 10

# Run the algorithm
de_mo = DifferentialEvolutionMO([objective_function_1, objective_function_2], bounds,
                                population_size=50, max_generations=100)
best_fitness_history = de_mo.run()

# Plot the optimization curves (one per objective)
plt.plot(best_fitness_history)
plt.title('DE Optimization Curve for Multi-Objective Problem')
plt.xlabel('Generation')
plt.ylabel('Best Fitness')
plt.legend(['Objective 1', 'Objective 2'])
plt.show()
```
Sample output (abridged; exact values vary between runs):

```
Generation 1: Best Fitness = [25.45725123 48.65198317]
Generation 2: Best Fitness = [25.45725123 48.65198317]
Generation 3: Best Fitness = [21.13878637 19.96978815]
...
Generation 67: Best Fitness = [2.30680838 5.03874297]
...
Generation 100: Best Fitness = [2.30680838 5.03874297]
```

Progress stalls in later generations: replacement requires the trial to strictly Pareto-dominate its target, and such strict improvements become rare once the population consists mostly of mutually non-dominated solutions.
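Note that the component-wise minimum printed above tracks each objective separately and is not itself a Pareto front. Given the final `fitness` array (one row of objective values per individual), the non-dominated set can be extracted with a small helper (a sketch using the same dominance definition as the `dominates` method above; `pareto_front` and the sample values are illustrative, not part of the original code):

```python
import numpy as np

def pareto_front(fitness):
    """Return indices of the non-dominated rows of an (N, M) fitness array."""
    n = len(fitness)
    nondominated = []
    for i in range(n):
        # i is dominated if some j is <= in every objective and < in at least one
        dominated = any(
            np.all(fitness[j] <= fitness[i]) and np.any(fitness[j] < fitness[i])
            for j in range(n) if j != i
        )
        if not dominated:
            nondominated.append(i)
    return nondominated

fit = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
print(pareto_front(fit))  # [0, 1, 3] -- [3, 3] is dominated by [2, 2]
```

In practice this would be applied to `de_mo.fitness` after the run to report the whole approximated front rather than per-objective minima.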
Solving a constrained optimization problem:

f(x) = \sum_{i=1}^{D} x_i^2, \quad \text{subject to } \sum_{i=1}^{D} x_i \geq 1
```python
class DifferentialEvolutionCO:
    def __init__(self, objective_func, constraints, bounds, population_size=50,
                 max_generations=100, F=0.5, CR=0.7):
        self.objective_func = objective_func
        self.constraints = constraints
        self.bounds = bounds
        self.population_size = population_size
        self.max_generations = max_generations
        self.F = F    # mutation scale factor
        self.CR = CR  # crossover rate
        self.dimensions = len(bounds)
        self.population = None
        self.fitness = None

    def initialize_population(self):
        self.population = np.random.uniform(
            low=[b[0] for b in self.bounds],
            high=[b[1] for b in self.bounds],
            size=(self.population_size, self.dimensions)
        )
        self.fitness = np.array([self.evaluate_fitness(ind) for ind in self.population])

    def evaluate_fitness(self, individual):
        # Static penalty: add a large constant for every violated constraint
        penalty = 0
        for constraint in self.constraints:
            if not constraint(individual):
                penalty += 1e6
        return self.objective_func(individual) + penalty

    def mutate(self, target_idx):
        # DE/rand/1: three mutually distinct individuals, none of them the target
        candidates = [i for i in range(self.population_size) if i != target_idx]
        a, b, c = np.random.choice(candidates, 3, replace=False)
        mutant = self.population[a] + self.F * (self.population[b] - self.population[c])
        # Clip the mutant back into the search bounds
        return np.clip(mutant, [lo for lo, hi in self.bounds], [hi for lo, hi in self.bounds])

    def crossover(self, target_idx, mutant):
        trial = np.copy(self.population[target_idx])
        cross_points = np.random.rand(self.dimensions) <= self.CR
        # Force at least one gene to come from the mutant
        cross_points[np.random.randint(self.dimensions)] = True
        trial[cross_points] = mutant[cross_points]
        return trial

    def select(self, target_idx, trial):
        # Greedy selection on the penalized fitness
        trial_fitness = self.evaluate_fitness(trial)
        if trial_fitness < self.fitness[target_idx]:
            self.population[target_idx] = trial
            self.fitness[target_idx] = trial_fitness

    def run(self):
        self.initialize_population()
        best_fitness_history = []
        for generation in range(self.max_generations):
            for i in range(self.population_size):
                mutant = self.mutate(i)
                trial = self.crossover(i, mutant)
                self.select(i, trial)
            best_fitness = np.min(self.fitness)
            best_fitness_history.append(best_fitness)
            print(f"Generation {generation + 1}: Best Fitness = {best_fitness}")
        return best_fitness_history

# Objective function
def objective_function(x):
    return np.sum(x**2)

# Constraint: the components must sum to at least 1
def constraint(x):
    return np.sum(x) >= 1

# Variable bounds
bounds = [(-5.12, 5.12)] * 10

# Run the algorithm
de_co = DifferentialEvolutionCO(objective_function, [constraint], bounds,
                                population_size=50, max_generations=100)
best_fitness_history = de_co.run()

# Plot the optimization curve
plt.plot(best_fitness_history)
plt.title('DE Optimization Curve for Constrained Problem')
plt.xlabel('Generation')
plt.ylabel('Best Fitness')
plt.show()
```
Sample output (abridged; exact values vary between runs):

```
Generation 1: Best Fitness = 32.99933243129078
Generation 2: Best Fitness = 32.99933243129078
...
Generation 98: Best Fitness = 0.12081782531774324
Generation 99: Best Fitness = 0.12081782531774324
Generation 100: Best Fitness = 0.12081782531774324
```

For this problem the constrained optimum is f = 0.1, attained at x_i = 0.1 for all i, so the final value of about 0.121 is close to optimal.
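The fixed penalty of 1e6 used above is simple but blunt: every infeasible point looks equally bad, so the search gets no gradient toward feasibility. A common alternative (not in the original code; this selection rule follows Deb's feasibility rules, with `violation` and `better` as illustrative names) compares candidates by constraint violation first and by objective value only among feasible ones:

```python
import numpy as np

def violation(x):
    # Amount by which sum(x) >= 1 is violated (0 when feasible)
    return float(max(0.0, 1.0 - np.sum(x)))

def better(x, y, objective):
    """Deb's rules: feasible beats infeasible; among feasible points the lower
    objective wins; among infeasible points the lower violation wins."""
    vx, vy = violation(x), violation(y)
    if vx == 0 and vy == 0:
        return objective(x) < objective(y)
    if vx == 0 or vy == 0:
        return vx == 0
    return vx < vy

obj = lambda x: np.sum(x**2)
feasible = np.array([0.5, 0.6])    # sum = 1.1 >= 1
infeasible = np.array([0.1, 0.1])  # smaller objective, but sum = 0.2 < 1
print(better(feasible, infeasible, obj))  # True
```

Using `better(trial, target, ...)` in place of the penalized comparison in `select` removes the need to hand-tune the penalty constant.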
Differential evolution is an efficient global optimizer that adapts readily to single-objective, multi-objective, and constrained problems, as the three cases above demonstrate with complete code and optimization curves. In practice its performance depends strongly on the parameter settings (population size, F, and CR), which usually need to be tuned experimentally for each problem.