Elitism in genetic algorithms

Elitism in genetic algorithms is used to preserve the best individuals (solutions) from one generation to the next. It ensures that the fittest individuals survive unchanged into subsequent generations, preventing valuable genetic material from being lost to the randomness of selection, crossover, and mutation.

In a genetic algorithm, each new generation of individuals is created through selection, crossover, and mutation operations.

Without elitism, the entire population might undergo significant changes in each generation, potentially losing highly fit individuals discovered earlier in the optimization process. Elitism typically involves selecting a certain number (usually a small fraction) of the best individuals from the current population and directly transferring them to the next generation without any modification.

These individuals, often referred to as “elites,” serve as a form of genetic memory, ensuring that the best solutions found so far persist across generations. In the diagram below, Individual A (shown in green) was the best performer in the first generation. A copy of this individual is then transferred to the second generation.

Transfer of elite performers from one generation to the next
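
To make the elitism step concrete, here is a minimal sketch of it in isolation. The population values and the elite count are illustrative placeholders, and the fitness scores follow the example function f(x) = -x² + 4x used in the code example below:

import numpy as np

# Illustrative placeholder population of candidate solutions
population = np.array([2.3, -1.7, 0.5, 3.9, 1.1])
# Fitness of each candidate under f(x) = -x^2 + 4x (rounded)
fitness_scores = np.array([3.91, -9.69, 1.75, 0.39, 3.19])

num_elites = 2  # how many top performers to carry over unchanged

# np.argsort() sorts in ascending order, so the last num_elites indices
# belong to the fittest individuals
elite_indices = np.argsort(fitness_scores)[-num_elites:]
elites = population[elite_indices]

# The next generation starts with exact copies of the elites; the remaining
# slots are then filled by selection, crossover, and mutation
next_generation = list(elites)
print("Elites carried over unchanged:", elites)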

Let's discuss the benefits and drawbacks of elitism in genetic algorithms, walk through a code example, and look at its practical applications.

Benefits of elitism

The key benefits of elitism in genetic algorithms include:

  • Preserving valuable genetic material: Elitism helps maintain the best solutions discovered during optimization, preventing their loss due to random variations.

  • Faster convergence: Elitism accelerates convergence by ensuring that the fittest individuals persist across generations, guiding the genetic algorithm more directly toward optimal or near-optimal solutions and reducing the iterations needed to reach them (see the sketch after this list).

  • Stability: Elitism can improve the stability of the optimization process by providing a consistent reference point from which new solutions are generated.
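
To see these benefits for yourself, the sketch below runs the same toy search twice, once with and once without a single elite, and counts how often the best fitness regresses between generations. The run() helper, the breeding scheme, and all parameter values are arbitrary choices for this illustration, not part of any library:

import numpy as np

def fitness(x):
    return -x**2 + 4*x  # toy objective with its optimum at x = 2

def run(elitism, generations=30, pop_size=10, mutation=2.0, seed=0):
    rng = np.random.default_rng(seed)
    population = rng.uniform(-10, 10, pop_size)
    best_history = []
    for _ in range(generations):
        scores = fitness(population)
        best_history.append(scores.max())
        best = population[np.argmax(scores)]
        # Breed everyone from random pairs with a large, disruptive mutation
        parents = rng.choice(population, size=(pop_size, 2))
        population = parents.mean(axis=1) + mutation * rng.normal(size=pop_size)
        if elitism:
            population[0] = best  # one elite survives unchanged
    return best_history

for flag in (False, True):
    history = run(elitism=flag)
    drops = sum(b < a for a, b in zip(history, history[1:]))
    print(f"elitism={flag}: final best fitness = {history[-1]:.3f}, "
          f"generations where the best got worse = {drops}")

With the elite in place, the best fitness can never decrease, because the previous best is always re-evaluated in the next generation; without it, large mutations regularly discard the best solution found so far.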

Drawbacks of elitism

While there are many benefits, it's also important to be aware of a couple of drawbacks:

  • Premature convergence: Preserving too many elites in each generation can cause the genetic algorithm to settle prematurely into a local optimum. This restricts the exploration of alternative regions of the search space, potentially hindering the discovery of globally optimal solutions.

  • Reduced diversity: Elitism may decrease population diversity, limiting exploration of the search space and hindering the discovery of novel solutions. A common mitigation is sketched after this list.
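
A common countermeasure, sketched below under assumed parameter values (the next_generation() helper is a construction for this illustration, not part of any library), is to keep the elite fraction small and to inject a few random "immigrants" each generation to restore diversity:

import numpy as np

def next_generation(population, fitness_scores, elite_fraction=0.05,
                    immigrant_fraction=0.1, mutation_rate=0.1,
                    bounds=(-10, 10), rng=None):
    # Build one generation that reserves slots for a small elite and for
    # random immigrants, filling the rest by crossover and mutation
    rng = rng or np.random.default_rng()
    pop_size = len(population)
    num_elites = max(1, int(pop_size * elite_fraction))
    num_immigrants = int(pop_size * immigrant_fraction)

    # A small elite limits selection pressure
    elites = population[np.argsort(fitness_scores)[-num_elites:]]

    # Random immigrants reintroduce diversity that elitism tends to erode
    immigrants = rng.uniform(bounds[0], bounds[1], num_immigrants)

    # Breed the remaining slots from the whole population, not just the elites
    num_children = pop_size - num_elites - num_immigrants
    parents = rng.choice(population, size=(num_children, 2))
    children = parents.mean(axis=1) + mutation_rate * rng.normal(size=num_children)

    return np.concatenate([elites, immigrants, children])

# Example: one generation step on a random population, scored with f(x) = -x^2 + 4x
population = np.random.uniform(-10, 10, 50)
population = next_generation(population, -population**2 + 4*population)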

Code example

Let's explore an example of a genetic algorithm with elitism in Python. In this example, we'll use a basic optimization problem: maximizing a simple mathematical function.

import numpy as np

# Define the fitness function
def fitness_function(x):
    # Example fitness function: f(x) = -x^2 + 4x
    return -x**2 + 4*x

# Define the genetic algorithm function
def genetic_algorithm(population_size, num_generations, elite_percentage, mutation_rate):
    # Initialize population with random solutions
    population = np.random.uniform(-10, 10, size=(population_size,))

    # Main loop for generations
    for generation in range(num_generations):
        # Evaluate fitness of each individual in the population
        fitness_scores = [fitness_function(individual) for individual in population]

        # Select elites; keep at least two so crossover has two distinct parents
        num_elites = max(2, int(population_size * elite_percentage))
        elite_indices = np.argsort(fitness_scores)[-num_elites:]
        elites = population[elite_indices]

        # Elitism: carry the elites into the next generation unchanged
        new_population = list(elites)

        # Fill the rest of the new population using crossover and mutation
        while len(new_population) < population_size:
            # Crossover: randomly select two distinct parents from the elites
            parent1, parent2 = np.random.choice(elites, size=2, replace=False)
            # Mutation: randomly perturb the offspring
            offspring = np.mean([parent1, parent2]) + mutation_rate * np.random.normal()
            new_population.append(offspring)

        # Update population
        population = np.array(new_population)

        # Display best solution in each generation
        best_solution = elites[-1]
        print(f"Generation {generation+1}: Best solution = {best_solution}, Fitness = {fitness_function(best_solution)}")

    # Return the best solution found
    return elites[-1]

# Set parameters
population_size = 50
num_generations = 10
elite_percentage = 0.1
mutation_rate = 0.1

# Run the genetic algorithm
best_solution = genetic_algorithm(population_size, num_generations, elite_percentage, mutation_rate)
print("Final best solution:", best_solution)
print("Final fitness:", fitness_function(best_solution))

Code explanation

  • Line 1: We import the numpy library and alias it as np. It is used for generating random numbers and performing numerical operations.

  • Lines 3–6: We define fitness_function(x), which represents the optimization goal. In this case, it returns the result of the expression -x² + 4x.

  • Lines 8–9: We define the main function genetic_algorithm(), which takes four parameters: population_size, num_generations, elite_percentage, and mutation_rate.

  • Lines 10–11: We initialize the population by generating random values between -10 and 10 using np.random.uniform(). Each value represents a candidate solution.

  • Lines 13–14: We start a loop that runs once for each generation.

  • Lines 15–16: We evaluate the fitness of each individual using fitness_function() and store the results in a list called fitness_scores.

  • Lines 18–21: We select the top-performing individuals (elites). We calculate how many elites to keep based on elite_percentage (keeping at least two so that crossover always has two distinct parents), get their indices using np.argsort(), and extract them from the population. Because np.argsort() sorts in ascending order, the last elite is the fittest individual.

  • Lines 23–24: We apply elitism itself: the elites are copied into the new population unchanged, so the best solutions found so far survive into the next generation.

  • Lines 26–32: We fill the remaining slots of the new population. In each iteration, we randomly select two distinct parents from the elites, generate an offspring by averaging their values, apply mutation by adding noise scaled by mutation_rate, and append the result to the new population.

  • Lines 34–35: We replace the current population with the new one for the next generation.

  • Lines 37–39: We take the best individual of the current generation (the last elite) and print its value and fitness score.

  • Lines 41–42: We return the best solution found in the final generation.

  • Lines 44–48: We define values for population_size, num_generations, elite_percentage, and mutation_rate.

  • Lines 50–51: We run the genetic algorithm using the defined parameters and store the result in best_solution.

  • Lines 52–53: We print the final best solution and its fitness.

Practical applications

Practical applications of elitism in genetic algorithms include:

  • Engineering: optimizing engineering designs.

  • Finance: aiding portfolio management decisions.

  • Bioinformatics: enhancing tasks such as protein folding prediction.

  • Machine learning: feature selection and parameter optimization.

  • Robotics: path planning and control.

  • Manufacturing: improving processes such as production scheduling.

Conclusion

Elitism in genetic algorithms effectively preserves superior solutions, expedites convergence, and enhances stability in optimization tasks. Despite potential drawbacks like premature convergence, it remains a valuable tool across diverse domains, from engineering to finance and bioinformatics, enabling efficient problem-solving and innovation.
