
Basic Optimizer

ropt.workflow._basic_optimizer

This module defines a basic optimization object.

ropt.workflow.BasicOptimizer

A class for executing single optimization runs.

The BasicOptimizer is designed to simplify the process of setting up and executing optimization workflows that consist primarily of a single optimization run.

This class provides a user-friendly interface for common optimization operations, including:

  • Initiating a Single Optimization: Easily start an optimization process with a provided configuration and evaluator.
  • Observing Optimization Events: Register observer functions to monitor and react to various events that occur during the optimization, such as the start of an evaluation or the availability of new results.
  • Abort Conditions: Define a callback function that can be used to check for abort conditions during the optimization.
  • Result Reporting: Define a callback function that will be called whenever new results become available.
  • Accessing Results: After the optimization is complete, the optimal results, corresponding variables, and the optimization's exit code are readily accessible.

By encapsulating the core elements of an optimization run, the BasicOptimizer reduces the boilerplate code required for simple optimization tasks, allowing users to focus on defining the optimization problem and analyzing the results.

The following example demonstrates how to find the optimum of the Rosenbrock function using a BasicOptimizer object; the best result found is available afterwards via the results property.

Example
import numpy as np
from numpy.typing import NDArray

from ropt.evaluator import EvaluatorContext, EvaluatorResult
from ropt.workflow import BasicOptimizer

DIM = 5
CONFIG = {
    "variables": {
        "variable_count": DIM,
        "perturbation_magnitudes": 1e-6,
    },
}
initial_values = 2 * np.arange(DIM) / DIM + 0.5


def rosenbrock(variables: NDArray[np.float64], _: EvaluatorContext) -> EvaluatorResult:
    objectives = np.zeros((variables.shape[0], 1), dtype=np.float64)
    for v_idx in range(variables.shape[0]):
        for d_idx in range(DIM - 1):
            x, y = variables[v_idx, d_idx : d_idx + 2]
            objectives[v_idx, 0] += (1.0 - x) ** 2 + 100 * (y - x * x) ** 2
    return EvaluatorResult(objectives=objectives)


optimizer = BasicOptimizer(CONFIG, rosenbrock)
optimizer.run(initial_values)

print(f"Optimal variables: {optimizer.results.evaluations.variables}")
print(f"Optimal objective: {optimizer.results.functions.target_objective}")
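The double loop in rosenbrock above can also be written with vectorized NumPy operations. A minimal sketch of an equivalent objective computation (the EvaluatorResult wrapping from the example is omitted here):

```python
import numpy as np
from numpy.typing import NDArray


def rosenbrock_vectorized(variables: NDArray[np.float64]) -> NDArray[np.float64]:
    # Adjacent pairs (x, y) along each row of the variables matrix.
    x = variables[:, :-1]
    y = variables[:, 1:]
    # Sum the Rosenbrock terms per realization; keep a trailing axis of
    # size one to match the (n_realizations, 1) objectives shape.
    terms = (1.0 - x) ** 2 + 100.0 * (y - x * x) ** 2
    return terms.sum(axis=1, keepdims=True)
```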
Customization

The optimization workflow executed by BasicOptimizer can be tailored by adding default event handlers. This allows for custom processing of events emitted by the default optimization workflow, without replacing the workflow itself. This is useful for tasks like custom logging or data processing.

Default event handlers can be specified in a JSON configuration file located at <prefix>/share/ropt/options.json, where <prefix> is the Python installation prefix or a system-wide data prefix.1 This JSON file should contain a basic_optimizer item, containing an event_handlers item that provides a list of strings of the form "module_name.handler_name". Here, module_name denotes a module containing an event handler class named handler_name.

Example <prefix>/share/ropt/options.json:

{
    "basic_optimizer": {
        "event_handlers": ["mylogger.Logger"]
    }
}
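To see which handlers such a file would enable, the relevant entries can be read back with a few lines of standard-library code. A sketch using a hypothetical helper (read_event_handlers is not part of ropt):

```python
import json


def read_event_handlers(options_text: str) -> list[str]:
    """Extract the "module_name.handler_name" strings from an options.json document."""
    options = json.loads(options_text)
    return options.get("basic_optimizer", {}).get("event_handlers", [])
```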

  1. The exact path to the Python installation prefix, or to the system's data prefix, can be found using the Python sysconfig module:

    from sysconfig import get_paths
    print(get_paths()["data"])
    
     

results property

results: FunctionResults | None

Return the optimal result found during the optimization.

This property provides access to the best FunctionResults object discovered during the optimization process. It encapsulates the objective function value, constraint values, and other relevant information about the optimal solution.

Returns:

  • FunctionResults | None: The optimal result.
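Since the property may return None, callers should guard for that case before accessing the nested attributes. A minimal sketch with a hypothetical helper:

```python
def optimal_variables(optimizer):
    """Return the optimal variables, raising if no results are available."""
    results = optimizer.results
    if results is None:
        raise RuntimeError("the optimization did not produce any results")
    return results.evaluations.variables
```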

__init__

__init__(
    config: dict[str, Any],
    evaluator: EvaluatorCallback | Evaluator,
    *,
    constraint_tolerance: float = 1e-10,
) -> None

Initialize a BasicOptimizer object.

This constructor sets up the necessary components for a single optimization run. It requires an optimization configuration and an evaluator, which together define the optimization problem.

The constraint_tolerance is used to check any constraints: if a constraint value is within this tolerance, the constraint is considered satisfied.

Parameters:

  • config (dict[str, Any]): The configuration for the optimization. Required.
  • evaluator (EvaluatorCallback | Evaluator): The evaluator object. Required.
  • constraint_tolerance (float): The constraint violation tolerance. Default: 1e-10.
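As one plausible reading of the tolerance check (the exact comparison used internally is not specified here), a constraint value within the tolerance is treated as satisfied:

```python
def constraint_satisfied(violation: float, tolerance: float = 1e-10) -> bool:
    # A violation whose magnitude is within the tolerance counts as satisfied.
    return abs(violation) <= tolerance
```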

run

run(initial_values: ArrayLike) -> ExitCode

Run the optimization process.

This method initiates and executes the optimization workflow defined by the BasicOptimizer object. It manages the optimization, result handling, and event processing. After the optimization is complete, the optimal results, variables, and exit code can be accessed via the corresponding properties.

Returns:

  • ExitCode: The exit code returned by the optimization workflow.

set_abort_callback

set_abort_callback(callback: Callable[[], bool]) -> None

Set a callback to check for abort conditions.

The provided callback function will be invoked repeatedly during the optimization process. If the callback returns True, the optimization will be aborted, and the BasicOptimizer will exit with ExitCode.USER_ABORT.

The callback function should take no arguments and return a boolean value.

Parameters:

  • callback (Callable[[], bool]): The callable to check for abort conditions. Required.
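For instance, a simple abort callback can be built from a closure. A sketch that signals abort after a fixed number of checks (the wiring to a BasicOptimizer instance is shown as a comment and assumes an existing optimizer object):

```python
def make_abort_after(max_checks: int):
    """Return a zero-argument callback that reports abort once called max_checks times."""
    state = {"calls": 0}

    def abort() -> bool:
        state["calls"] += 1
        return state["calls"] >= max_checks

    return abort


# Hypothetical wiring:
# optimizer.set_abort_callback(make_abort_after(100))
```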

set_results_callback

set_results_callback(callback: Callable[..., None]) -> None

Set a callback to report new results.

The provided callback function will be invoked whenever new results become available during the optimization process. This allows for real-time monitoring and analysis of the optimization's progress.

The callback function must have the following signature:

def callback(results: tuple[FunctionResults, ...]) -> None:
    ...

Parameters:

  • callback (Callable[..., None]): The callable that will be invoked to report new results. Required.
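A minimal results callback might simply record how many result objects arrive in each batch. A sketch (the wiring to a BasicOptimizer instance is shown as a comment and is a hypothetical usage):

```python
from typing import Any


def make_recording_callback(history: list[int]):
    """Return a results callback that records the size of each batch of results."""

    def callback(results: tuple[Any, ...]) -> None:
        history.append(len(results))

    return callback


# Hypothetical wiring:
# history: list[int] = []
# optimizer.set_results_callback(make_recording_callback(history))
```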