## Introduction
In pymoors, algorithms are implemented as classes exposed on the Python side with a set of useful attributes. These attributes include the final population and the optimum, i.e. the best set of individuals found during the optimization process.
For example, after running an algorithm like NSGA2, you can access:
- Final Population: The complete set of individuals from the last generation.
- Optimum Set: Typically, the best individuals (e.g., those with rank 0) that form the current approximation of the Pareto front.
This design abstracts away the complexities of the underlying Rust implementation and provides an intuitive, Pythonic interface for setting up, executing, and analyzing multi-objective optimization problems.
## Mathematical Formulation of a Multi-Objective Optimization Problem with Constraints
Consider the following optimization problem:
$$
\begin{aligned}
\min_{x_1, x_2} \quad & f_1(x_1, x_2) = x_1^2 + x_2^2 \\
\min_{x_1, x_2} \quad & f_2(x_1, x_2) = (x_1 - 1)^2 + x_2^2 \\
\text{subject to} \quad & x_1 + x_2 \leq 1, \\
& x_1 \geq 0, \quad x_2 \geq 0.
\end{aligned}
$$
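For reference, this problem can be solved analytically: for any fixed $x_1$, decreasing $x_2$ toward zero reduces both objectives while keeping the solution feasible, so every Pareto-optimal point has $x_2 = 0$. What remains is the trade-off between $f_1 = x_1^2$ and $f_2 = (x_1 - 1)^2$ for $x_1 \in [0, 1]$:

$$
\mathcal{P} = \{(x_1, 0) \mid 0 \leq x_1 \leq 1\}, \qquad f_2 = \left(1 - \sqrt{f_1}\right)^2, \quad f_1 \in [0, 1].
$$

This closed form is handy for sanity-checking the front returned by the algorithm.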
Below is how you can formulate and solve this problem in pymoors:
```python
import numpy as np

from pymoors import (
    Nsga2,
    RandomSamplingFloat,
    GaussianMutation,
    SimulatedBinaryCrossover,
    CloseDuplicatesCleaner,
)
from pymoors.typing import TwoDArray


# Define the fitness function. It is vectorized: `genes` has shape
# (population_size, n_vars) and the result has one column per objective.
def fitness(genes: TwoDArray) -> TwoDArray:
    x1 = genes[:, 0]
    x2 = genes[:, 1]
    # Objective 1: f1(x1, x2) = x1^2 + x2^2
    f1 = x1**2 + x2**2
    # Objective 2: f2(x1, x2) = (x1 - 1)^2 + x2^2
    f2 = (x1 - 1) ** 2 + x2**2
    return np.column_stack([f1, f2])


# Define the constraints function
def constraints(genes: TwoDArray) -> TwoDArray:
    x1 = genes[:, 0]
    x2 = genes[:, 1]
    # Constraint 1: x1 + x2 <= 1, encoded as g1 = x1 + x2 - 1 <= 0
    g1 = x1 + x2 - 1
    # Convert to a 2D array with one column per constraint
    return g1.reshape(-1, 1)


# Set up the NSGA2 algorithm with the above definitions
algorithm = Nsga2(
    sampler=RandomSamplingFloat(min=0, max=1),
    crossover=SimulatedBinaryCrossover(distribution_index=5),
    mutation=GaussianMutation(gene_mutation_rate=0.1, sigma=0.01),
    fitness_fn=fitness,
    constraints_fn=constraints,  # Pass the constraints function
    duplicates_cleaner=CloseDuplicatesCleaner(epsilon=1e-8),
    n_vars=2,
    population_size=200,
    n_offsprings=200,
    num_iterations=200,
    mutation_rate=0.1,
    crossover_rate=0.9,
    keep_infeasible=False,
    lower_bound=0,  # enforces x1 >= 0 and x2 >= 0
    verbose=False,
)
```
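As the `constraints` function above suggests, a solution is considered feasible when every value returned by `constraints_fn` is less than or equal to zero, so each inequality must be rewritten in the form $g(x) \leq 0$. For problems with several inequalities you would return one column per constraint. A minimal sketch, reusing the imports above; the second inequality ($x_1 \geq 0.2$) is purely hypothetical:

```python
def constraints_multi(genes: TwoDArray) -> TwoDArray:
    x1 = genes[:, 0]
    x2 = genes[:, 1]
    g1 = x1 + x2 - 1  # x1 + x2 <= 1  ->  g1 <= 0
    g2 = 0.2 - x1     # hypothetical extra constraint x1 >= 0.2  ->  g2 <= 0
    # One column per constraint, shape (population_size, 2)
    return np.column_stack([g1, g2])
```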
```python
# Run the algorithm
algorithm.run()
population = algorithm.population

# Get the fitness
population.fitness[:10]
```

```
array([[9.99394079e-01, 9.33103923e-08],
       [1.79754182e-07, 9.99783590e-01],
       [7.50372090e-02, 5.27190285e-01],
       [2.80567643e-01, 2.21203093e-01],
       [2.45794571e-01, 2.54248519e-01],
       [9.48968574e-01, 6.82047091e-04],
       [8.13828732e-02, 5.10873063e-01],
       [9.68096834e-03, 8.12970387e-01],
       [8.97929178e-01, 2.76410736e-03],
       [2.90672542e-01, 2.12399375e-01]])
```
```python
# Get the variables
population.genes[:10]
```

```
array([[9.99696993e-01, 3.86900355e-05],
       [1.08294861e-04, 4.09910240e-04],
       [2.73923462e-01, 1.77371818e-03],
       [5.29682275e-01, 2.08101070e-03],
       [4.95773026e-01, 1.91765718e-03],
       [9.74143263e-01, 3.67100179e-03],
       [2.85254905e-01, 3.53726804e-03],
       [9.83552908e-02, 2.68423465e-03],
       [9.47582535e-01, 4.06407890e-03],
       [5.39136584e-01, 2.07037314e-03]])
```
```python
# Get the optimal solutions
best = population.best
best[0].genes
```

```
array([9.99696993e-01, 3.86900355e-05])
```

```python
best[0].fitness
```

```
array([9.99394079e-01, 9.33103923e-08])
```
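The genes of the best individuals have $x_2$ very close to zero, matching the analytical Pareto set derived earlier. A minimal sketch for sanity-checking convergence, assuming each element of `population.best` exposes the `.genes` and `.fitness` attributes shown above:

```python
import numpy as np

# Stack the fitness of the rank-0 individuals into a single array
best_fitness = np.array([ind.fitness for ind in best])

# On the true front f2 = (1 - sqrt(f1))^2, so this residual
# should be close to zero for a well-converged run
f1, f2 = best_fitness[:, 0], best_fitness[:, 1]
residual = np.abs(f2 - (1 - np.sqrt(f1)) ** 2)
print(residual.max())
```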