Problem

The Pymanopt problem class.

class pymanopt.core.problem.Problem(manifold, cost, *, euclidean_gradient=None, riemannian_gradient=None, euclidean_hessian=None, riemannian_hessian=None, preconditioner=None)[source]

Bases: object

Problem class to define a Riemannian optimization problem.

Parameters
  • manifold (pymanopt.manifolds.manifold.Manifold) – Manifold to optimize over.

  • cost (pymanopt.autodiff.Function) – A callable decorated with a decorator from pymanopt.function which takes a point on the manifold and returns a real scalar. If any decorator other than pymanopt.function.numpy() is used, the gradient and Hessian functions are generated automatically as needed, provided no {euclidean,riemannian}_gradient or {euclidean,riemannian}_hessian arguments are given (see the sketch after this parameter list).

  • euclidean_gradient (Optional[pymanopt.autodiff.Function]) – The Euclidean gradient, i.e., the gradient of the cost function in the typical sense in the ambient space. The returned value need not belong to the tangent space of manifold.

  • riemannian_gradient (Optional[pymanopt.autodiff.Function]) – The Riemannian gradient. For embedded submanifolds this is simply the projection of euclidean_gradient onto the tangent space of manifold. In most cases this need not be provided; the Riemannian gradient is instead computed internally. If provided, the function must return a vector in the tangent space of manifold.

  • euclidean_hessian (Optional[pymanopt.autodiff.Function]) – The Euclidean Hessian, i.e., the directional derivative of euclidean_gradient in the direction of a tangent vector.

  • riemannian_hessian (Optional[pymanopt.autodiff.Function]) – The Riemannian Hessian, i.e., the directional derivative of riemannian_gradient in the direction of a tangent vector. As with riemannian_gradient, this usually need not be provided explicitly.

  • preconditioner (Optional[Callable]) –

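A minimal usage sketch, assuming the autograd backend, the Sphere manifold, and an illustrative quadratic cost (none of which are mandated by the class itself):

    import autograd.numpy as anp

    import pymanopt
    from pymanopt.manifolds import Sphere

    # Illustrative setup: minimize the quadratic form x^T A x over the unit
    # sphere in R^3.
    manifold = Sphere(3)
    A = anp.diag(anp.array([3.0, 2.0, 1.0]))

    @pymanopt.function.autograd(manifold)
    def cost(point):
        # Real scalar cost; the gradient and Hessian are derived
        # automatically because an autodiff decorator is used.
        return point @ A @ point

    problem = pymanopt.Problem(manifold, cost)

With the pymanopt.function.numpy() decorator no automatic differentiation is available, so the Euclidean gradient (and, if a second-order optimizer is used, the Euclidean Hessian) must be supplied explicitly, for example:

    @pymanopt.function.numpy(manifold)
    def cost_numpy(point):
        return point @ A @ point

    @pymanopt.function.numpy(manifold)
    def euclidean_gradient(point):
        # Gradient of x^T A x in the ambient space; it is projected onto
        # the tangent space internally to obtain the Riemannian gradient.
        return 2 * A @ point

    problem_numpy = pymanopt.Problem(
        manifold, cost_numpy, euclidean_gradient=euclidean_gradient
    )
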
property cost
property euclidean_gradient
property riemannian_gradient
property euclidean_hessian
property riemannian_hessian
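
These properties expose the corresponding callables (auto-generated where necessary), so cost and gradient values can be inspected directly without running an optimizer. A short sketch, continuing the example above:

    point = manifold.random_point()
    value = problem.cost(point)
    grad = problem.riemannian_gradient(point)  # tangent vector at `point`
    print(value, manifold.norm(point, grad))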