Optimization
Optimizers
Optimizer
- class pymanopt.optimizers.optimizer.OptimizerResult(point: Any, cost: float, iterations: int, stopping_criterion: str, time: float, cost_evaluations: Optional[int] = None, step_size: Optional[float] = None, gradient_norm: Optional[float] = None, log: Optional[Dict] = None)[source]
Bases:
object
- Parameters
point (Any) –
cost (float) –
iterations (int) –
stopping_criterion (str) –
time (float) –
cost_evaluations (Optional[int]) –
step_size (Optional[float]) –
gradient_norm (Optional[float]) –
log (Optional[Dict]) –
- Return type
None
- point: Any
- cost: float
- iterations: int
- stopping_criterion: str
- time: float
- cost_evaluations: Optional[int] = None
- step_size: Optional[float] = None
- gradient_norm: Optional[float] = None
- log: Optional[Dict] = None
- class pymanopt.optimizers.optimizer.Optimizer(max_time=1000, max_iterations=1000, min_gradient_norm=1e-06, min_step_size=1e-10, max_cost_evaluations=5000, verbosity=2, log_verbosity=0)[source]
Bases:
object
Abstract base class for Pymanopt optimizers.
- Parameters
max_time (float) – Upper bound on the run time of an optimizer in seconds.
max_iterations (int) – The maximum number of iterations to perform.
min_gradient_norm (float) – Termination threshold for the norm of the (Riemannian) gradient.
min_step_size (float) – Termination threshold for the line search step size.
max_cost_evaluations (int) – Maximum number of allowed cost function evaluations.
verbosity (int) – Level of information printed by the optimizer while it operates: 0 is silent, 2 is most verbose.
log_verbosity (int) – Level of information logged by the optimizer while it operates: 0 logs nothing, 1 logs information for each iteration.
- abstract run(problem, *, initial_point=None, **kwargs)[source]
Run an optimizer on a given optimization problem.
- Parameters
problem – Pymanopt problem class instance exposing the cost function and the manifold to optimize over.
initial_point – Initial point on the manifold. If no value is provided then a starting point will be randomly generated.
**kwargs – Potential optimizer-specific keyword arguments.
- Returns
The optimization result.
- Return type
OptimizerResult
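The stopping-criterion parameters above are shared by all concrete optimizers, and every run() returns an OptimizerResult whose fields are listed at the top of this page. An illustrative sketch (the sphere manifold, the quadratic cost, and the autograd backend are example choices, not requirements):

    import autograd.numpy as anp

    import pymanopt
    from pymanopt.manifolds import Sphere
    from pymanopt.optimizers import SteepestDescent

    manifold = Sphere(3)
    matrix = anp.diag(anp.array([3.0, 2.0, 1.0]))

    @pymanopt.function.autograd(manifold)
    def cost(point):
        # Rayleigh quotient: minimized at the eigenvector of the
        # smallest eigenvalue of `matrix`.
        return point @ matrix @ point

    problem = pymanopt.Problem(manifold, cost)

    # Stopping criteria are configured on the optimizer, not passed to run().
    optimizer = SteepestDescent(
        max_iterations=200, min_gradient_norm=1e-8, verbosity=0
    )
    result = optimizer.run(problem)
    print(result.point)               # final iterate on the manifold
    print(result.cost)                # cost at the final iterate
    print(result.stopping_criterion)  # human-readable termination reason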
Conjugate Gradients
- class pymanopt.optimizers.conjugate_gradient.ConjugateGradient(beta_rule='HestenesStiefel', orth_value=inf, line_searcher=None, *args, **kwargs)[source]
Bases:
pymanopt.optimizers.optimizer.Optimizer
Riemannian conjugate gradient method.
Perform optimization using the nonlinear conjugate gradient method with line_searcher. This method first computes the gradient of the cost function and then optimizes by moving in a direction that is conjugate to all previous search directions.
- Parameters
beta_rule (str) – Conjugate gradient beta rule used to construct the new search direction. Valid choices are {"FletcherReeves", "PolakRibiere", "HestenesStiefel", "HagerZhang"}.
orth_value – Parameter for Powell’s restart strategy. An infinite value disables this strategy.
line_searcher – The line search method.
Notes
See [HZ2006] for details about Powell’s restart strategy.
- run(problem, *, initial_point=None, reuse_line_searcher=False)[source]
Run CG method.
- Parameters
problem – Pymanopt problem class instance exposing the cost function and the manifold to optimize over. Since this is a gradient-based method, the problem must provide the gradient of the cost function.
initial_point – Initial point on the manifold. If no value is provided then a starting point will be randomly generated.
reuse_line_searcher – Whether to reuse the previous line searcher. Allows using information from a previous call to run().
- Returns
Local minimum of the cost function, or the most recent iterate if algorithm terminated before convergence.
- Return type
OptimizerResult
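An illustrative sketch selecting a non-default beta rule (the manifold, cost, and autograd backend are example choices):

    import autograd.numpy as anp

    import pymanopt
    from pymanopt.manifolds import Sphere
    from pymanopt.optimizers import ConjugateGradient

    manifold = Sphere(10)
    matrix = anp.diag(anp.arange(1.0, 11.0))

    @pymanopt.function.autograd(manifold)
    def cost(point):
        # Maximize the Rayleigh quotient by minimizing its negative.
        return -point @ matrix @ point

    problem = pymanopt.Problem(manifold, cost)
    optimizer = ConjugateGradient(beta_rule="PolakRibiere", verbosity=0)
    result = optimizer.run(problem)
    # result.point is (up to sign) the eigenvector of the largest eigenvalue.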
Nelder-Mead Algorithm
- pymanopt.optimizers.nelder_mead.compute_centroid(manifold, points)[source]
Compute the centroid of points on the manifold as their Karcher mean, i.e., the point minimizing the sum of squared Riemannian distances to the given points.
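An illustrative sketch of a direct call (assumes the manifold API's random_point() method):

    from pymanopt.manifolds import Sphere
    from pymanopt.optimizers.nelder_mead import compute_centroid

    manifold = Sphere(3)
    points = [manifold.random_point() for _ in range(4)]
    # The Karcher mean: the point on the sphere minimizing the sum of
    # squared Riemannian distances to `points`.
    centroid = compute_centroid(manifold, points)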
- class pymanopt.optimizers.nelder_mead.NelderMead(max_cost_evaluations=None, max_iterations=None, reflection=1, expansion=2, contraction=0.5, *args, **kwargs)[source]
Bases:
pymanopt.optimizers.optimizer.Optimizer
Nelder-Mead algorithm.
Perform optimization using the derivative-free Nelder-Mead minimization algorithm.
- Parameters
max_cost_evaluations – Maximum number of allowed cost function evaluations.
max_iterations – Maximum number of allowed iterations.
reflection – Determines how far to reflect away from the worst vertex: stretched (reflection > 1), compressed (0 < reflection < 1), or exact (reflection = 1).
expansion – Factor by which to expand the reflected simplex.
contraction – Factor by which to contract the reflected simplex.
- run(problem, *, initial_point=None)[source]
Run Nelder-Mead algorithm.
- Parameters
problem – Pymanopt problem class instance exposing the cost function and the manifold to optimize over.
initial_point – Initial point on the manifold. If no value is provided then a starting point will be randomly generated.
- Returns
Local minimum of the cost function, or the most recent iterate if algorithm terminated before convergence.
- Return type
OptimizerResult
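Since the method is derivative-free, the problem only needs a cost function; an illustrative sketch using the plain numpy backend:

    import pymanopt
    from pymanopt.manifolds import Sphere
    from pymanopt.optimizers import NelderMead

    manifold = Sphere(3)

    @pymanopt.function.numpy(manifold)
    def cost(point):
        # No gradient required; minimized at point = (1, 0, 0).
        return -point[0]

    problem = pymanopt.Problem(manifold, cost)
    result = NelderMead(max_iterations=100).run(problem)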
Particle Swarms
- class pymanopt.optimizers.particle_swarm.ParticleSwarm(max_cost_evaluations=None, max_iterations=None, population_size=None, nostalgia=1.4, social=1.4, *args, **kwargs)[source]
Bases:
pymanopt.optimizers.optimizer.Optimizer
Particle swarm optimization (PSO) method.
Perform optimization using the derivative-free particle swarm optimization algorithm.
- Parameters
max_cost_evaluations – Maximum number of allowed cost evaluations.
max_iterations – Maximum number of allowed iterations.
population_size – Size of the considered swarm population.
nostalgia – Weight of a particle’s attraction towards its own best previous position (the cognitive component).
social – Weight of a particle’s attraction towards the best position found by its neighbors (the social component).
- run(problem, *, initial_point=None)[source]
Run PSO algorithm.
- Parameters
problem – Pymanopt problem class instance exposing the cost function and the manifold to optimize over.
initial_point – Initial point on the manifold. If no value is provided then a starting point will be randomly generated.
- Returns
Local minimum of the cost function, or the most recent iterate if algorithm terminated before convergence.
- Return type
OptimizerResult
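As with Nelder-Mead, no gradient is needed; an illustrative sketch (the cost and parameter values are example choices):

    import numpy as np

    import pymanopt
    from pymanopt.manifolds import Sphere
    from pymanopt.optimizers import ParticleSwarm

    manifold = Sphere(3)
    target = np.array([1.0, 0.0, 0.0])

    @pymanopt.function.numpy(manifold)
    def cost(point):
        # Squared Euclidean distance to a target point; minimized at `target`.
        return np.sum((point - target) ** 2)

    problem = pymanopt.Problem(manifold, cost)
    optimizer = ParticleSwarm(population_size=50, nostalgia=1.4, social=1.4)
    result = optimizer.run(problem)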
Steepest Descent
- class pymanopt.optimizers.steepest_descent.SteepestDescent(line_searcher=None, *args, **kwargs)[source]
Bases:
pymanopt.optimizers.optimizer.Optimizer
Riemannian steepest descent algorithm.
Perform optimization using gradient descent with line search. This method first computes the gradient of the objective, and then optimizes by moving in the direction of steepest descent (which is the opposite direction to the gradient).
- Parameters
line_searcher – The line search method.
- run(problem, *, initial_point=None, reuse_line_searcher=False)[source]
Run steepest descent algorithm.
- Parameters
problem – Pymanopt problem class instance exposing the cost function and the manifold to optimize over. Since this is a gradient-based method, the problem must provide the gradient of the cost function.
initial_point – Initial point on the manifold. If no value is provided then a starting point will be randomly generated.
reuse_line_searcher – Whether to reuse the previous line searcher. Allows using information from a previous call to run().
- Returns
Local minimum of the cost function, or the most recent iterate if algorithm terminated before convergence.
- Return type
OptimizerResult
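An illustrative sketch of supplying a custom line searcher and reusing its state across warm-started runs (manifold, cost, and backend are example choices):

    import autograd.numpy as anp

    import pymanopt
    from pymanopt.manifolds import Sphere
    from pymanopt.optimizers import SteepestDescent
    from pymanopt.optimizers.line_search import BackTrackingLineSearcher

    manifold = Sphere(3)
    matrix = anp.diag(anp.array([3.0, 2.0, 1.0]))

    @pymanopt.function.autograd(manifold)
    def cost(point):
        return point @ matrix @ point

    problem = pymanopt.Problem(manifold, cost)
    optimizer = SteepestDescent(
        line_searcher=BackTrackingLineSearcher(), verbosity=0
    )
    result = optimizer.run(problem)
    # Warm-start a second run from the previous iterate, keeping the line
    # searcher's internal state.
    result = optimizer.run(
        problem, initial_point=result.point, reuse_line_searcher=True
    )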
Riemannian Trust Regions Algorithm
- class pymanopt.optimizers.trust_regions.TrustRegions(miniter=3, kappa=0.1, theta=1.0, rho_prime=0.1, use_rand=False, rho_regularization=1000.0, *args, **kwargs)[source]
Bases:
pymanopt.optimizers.optimizer.Optimizer
Riemannian trust-regions algorithm.
The inner trust-region subproblem is solved approximately with a truncated conjugate gradient (tCG) method; the constants below label the reasons the inner tCG iteration can terminate.
- NEGATIVE_CURVATURE = 0
- EXCEEDED_TR = 1
- REACHED_TARGET_LINEAR = 2
- REACHED_TARGET_SUPERLINEAR = 3
- MAX_INNER_ITER = 4
- MODEL_INCREASED = 5
- TCG_STOP_REASONS = {0: 'negative curvature', 1: 'exceeded trust region', 2: 'reached target residual-kappa (linear)', 3: 'reached target residual-theta (superlinear)', 4: 'maximum inner iterations', 5: 'model increased'}
- run(problem, *, initial_point=None, mininner=1, maxinner=None, Delta_bar=None, Delta0=None)[source]
Run the Riemannian trust-regions algorithm on a given optimization problem.
- Parameters
problem – Pymanopt problem class instance exposing the cost function and the manifold to optimize over.
initial_point – Initial point on the manifold. If no value is provided then a starting point will be randomly generated.
mininner – Minimum number of inner tCG iterations.
maxinner – Maximum number of inner tCG iterations.
Delta_bar – Maximum trust-region radius.
Delta0 – Initial trust-region radius.
- Returns
The optimization result.
- Return type
OptimizerResult
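The method needs second-order information; with an autodiff backend the required Hessian-vector products are derived automatically. An illustrative sketch (the Delta_bar value is an example choice):

    import autograd.numpy as anp

    import pymanopt
    from pymanopt.manifolds import Sphere
    from pymanopt.optimizers import TrustRegions

    manifold = Sphere(5)
    matrix = anp.diag(anp.arange(1.0, 6.0))

    @pymanopt.function.autograd(manifold)
    def cost(point):
        return -point @ matrix @ point

    # Gradient and Hessian-vector products are derived by the backend.
    problem = pymanopt.Problem(manifold, cost)
    optimizer = TrustRegions(verbosity=0)
    result = optimizer.run(problem, Delta_bar=1.0)  # cap the trust-region radius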
Line-Search Methods
- class pymanopt.optimizers.line_search.BackTrackingLineSearcher(contraction_factor=0.5, optimism=2, sufficient_decrease=0.0001, max_iterations=25, initial_step_size=1)[source]
Bases:
object
Back-tracking line-search algorithm.
- search(objective, manifold, x, d, f0, df0)[source]
Function to perform backtracking line search.
- Parameters
objective – Objective function to optimize.
manifold – The manifold to optimize over.
x – Starting point on the manifold.
d – Tangent vector at x, i.e., a descent direction.
f0 – Objective value at x.
df0 – Directional derivative at x along d.
- Returns
A tuple (step_size, newx) where step_size is the norm of the vector retracted to reach the suggested iterate newx from x.
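Typically a line searcher is passed to a gradient-based optimizer via its line_searcher argument rather than called directly. For completeness, an illustrative direct call (assumes the manifold methods random_point(), euclidean_to_riemannian_gradient(), and inner_product() of the 2.x API):

    import numpy as np

    from pymanopt.manifolds import Sphere
    from pymanopt.optimizers.line_search import BackTrackingLineSearcher

    manifold = Sphere(3)
    matrix = np.diag([3.0, 2.0, 1.0])

    def objective(point):
        return point @ matrix @ point

    x = manifold.random_point()
    euclidean_gradient = 2 * matrix @ x
    riemannian_gradient = manifold.euclidean_to_riemannian_gradient(
        x, euclidean_gradient
    )
    d = -riemannian_gradient  # descent direction
    f0 = objective(x)
    df0 = manifold.inner_product(x, riemannian_gradient, d)  # negative along d

    searcher = BackTrackingLineSearcher()
    step_size, newx = searcher.search(objective, manifold, x, d, f0, df0)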