Tools
Multi Tools
- pymanopt.tools.multi.multitransp(A)[source]
Vectorized matrix transpose.
A is assumed to be an array containing M matrices, each of which has dimension N x P. That is, A is an M x N x P array. multitransp then returns an array containing the M matrix transposes of the matrices in A, each of which will be P x N.
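In NumPy the operation reduces to swapping the last two axes of the stacked array; a minimal sketch (the library's implementation may differ in details):

```python
import numpy as np


def multitransp(A):
    # Transpose each matrix in the stack by swapping the last two axes:
    # shape (M, N, P) becomes (M, P, N).
    return np.swapaxes(A, -2, -1)


A = np.arange(2 * 3 * 4).reshape(2, 3, 4)
B = multitransp(A)
# B has shape (2, 4, 3) and B[i] equals A[i].T for each i.
```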
- pymanopt.tools.multi.multisym(A)[source]
Vectorized matrix symmetrization.
Given an array A of matrices (represented as an array of shape (k, n, n)), returns a version of A with each matrix symmetrized, i.e., every matrix A[i] satisfies A[i] == A[i].T.
- pymanopt.tools.multi.multiskew(A)[source]
Vectorized matrix skew-symmetrization.
Similar to multisym(), but returns an array where each matrix A[i] is skew-symmetric, i.e., the components of A satisfy A[i] == -A[i].T.
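Both helpers can be sketched with the same axis swap, taking the symmetric and skew-symmetric parts of each matrix in the stack (a sketch, not necessarily the library's exact implementation):

```python
import numpy as np


def multisym(A):
    # Symmetric part of each matrix in a (k, n, n) stack.
    return 0.5 * (A + np.swapaxes(A, -2, -1))


def multiskew(A):
    # Skew-symmetric part of each matrix in a (k, n, n) stack.
    return 0.5 * (A - np.swapaxes(A, -2, -1))


A = np.random.default_rng(0).standard_normal((3, 4, 4))
S, K = multisym(A), multiskew(A)
```

Note that S + K reconstructs A, since any square matrix is the sum of its symmetric and skew-symmetric parts.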
Diagnostics
- pymanopt.tools.diagnostics.identify_linear_piece(x, y, window_length)[source]
Identify a segment of the curve (x, y) that appears to be linear.
This function attempts to identify a contiguous segment of the curve defined by the vectors x and y that appears to be linear. A line is fit through the data over all windows of length window_length and the best fit is retained. The output segment specifies the range of indices such that x[segment] is the portion over which (x, y) is most linear, and the output poly specifies the first-order polynomial that best fits (x, y) over that segment (highest-degree coefficients first).
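The windowed fitting described above can be sketched as follows (an illustrative reimplementation, not the library code; it assumes NumPy arrays and window_length >= 3):

```python
import numpy as np


def identify_linear_piece(x, y, window_length):
    # Fit a degree-1 polynomial over every contiguous window and keep
    # the window with the smallest squared residual.
    residues, polys = [], []
    for i in range(len(x) - window_length + 1):
        window = slice(i, i + window_length)
        poly, residual, *_ = np.polyfit(x[window], y[window], deg=1, full=True)
        residues.append(residual[0])
        polys.append(poly)
    best = int(np.argmin(residues))
    segment = np.arange(best, best + window_length)
    return segment, polys[best]


# A curve that is linear (slope 2) on the first half, quadratic afterwards.
x = np.linspace(0.0, 1.0, 20)
y = np.where(x < 0.5, 2.0 * x, 2.0 * x + 10.0 * (x - 0.5) ** 2)
segment, poly = identify_linear_piece(x, y, 5)
# The retained window lies in the linear half and poly[0] is close to 2.
```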
- pymanopt.tools.diagnostics.check_directional_derivative(problem, x=None, d=None, *, use_quadratic_model=False)[source]
Checks the consistency of the cost function and directional derivatives.
check_directional_derivative performs a numerical test to check that the directional derivatives defined in the problem agree up to first or second order with the cost function at some point x, along some direction d. The test is based on a truncated Taylor series. Both x and d are optional and will be sampled at random if omitted.
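The idea behind the test can be illustrated in plain Euclidean space (a simplification: Pymanopt runs the analogous check on the manifold, moving along a retraction). If the gradient is correct, the first-order Taylor residual decays like O(t^2), so its slope in a log-log plot against t is about 2:

```python
import numpy as np


def taylor_slopes(cost, gradient, x, d):
    # Log-log slope of the first-order Taylor residual
    # |f(x + t d) - f(x) - t <grad f(x), d>| as a function of t.
    ts = np.logspace(-6, -1, 20)
    errors = np.array([
        abs(cost(x + t * d) - cost(x) - t * (gradient(x) @ d)) for t in ts
    ])
    return np.diff(np.log(errors)) / np.diff(np.log(ts))


rng = np.random.default_rng(0)
x, d = rng.standard_normal(5), rng.standard_normal(5)
# For f(x) = <x, x> with the correct gradient 2x, the slopes hover near 2.
slopes = taylor_slopes(lambda v: v @ v, lambda v: 2 * v, x, d)
```

A wrong gradient makes the residual decay only like O(t), i.e., the observed slope drops to about 1.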
- pymanopt.tools.diagnostics.check_gradient(problem, x=None, d=None)[source]
Checks the consistency of the cost function and the gradient.
check_gradient performs a numerical test to check that the gradient defined in the problem agrees up to first order with the cost function at some point x, along some direction d. The test is based on a truncated Taylor series.
It is also tested that the gradient is indeed a tangent vector.
Both x and d are optional and will be sampled at random if omitted.
- pymanopt.tools.diagnostics.check_hessian(problem, point=None, tangent_vector=None)[source]
Checks the consistency of the cost function and the Hessian.
The function performs a numerical test to check that the Hessian defined in the problem agrees up to second order with the cost function at some point, along some direction. The test is based on a truncated Taylor series.
It is also tested that the result of applying the Hessian along that direction is indeed a tangent vector, and that the Hessian operator is linear and symmetric w.r.t. the Riemannian metric.
Both point and tangent_vector are optional and will be sampled at random if omitted.
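The second-order analogue of the Taylor test, again sketched in Euclidean space for illustration (Pymanopt's check works on the manifold): with a correct Hessian-vector product, the residual is O(t^3), so the log-log slope is about 3.

```python
import numpy as np


def second_order_slopes(cost, gradient, hvp, x, d):
    # Log-log slope of the second-order Taylor residual
    # |f(x + t d) - f(x) - t <g, d> - (t^2 / 2) <H[d], d>| against t.
    ts = np.logspace(-4, -1, 20)
    errors = np.array([
        abs(
            cost(x + t * d)
            - cost(x)
            - t * (gradient(x) @ d)
            - 0.5 * t**2 * (hvp(x, d) @ d)
        )
        for t in ts
    ])
    return np.diff(np.log(errors)) / np.diff(np.log(ts))


# f(x) = sum(x^4): gradient 4 x^3, Hessian-vector product 12 x^2 * d.
x = np.array([1.0, 1.0, 1.0])
d = np.array([1.0, 2.0, 0.5])
slopes = second_order_slopes(
    lambda v: np.sum(v**4),
    lambda v: 4 * v**3,
    lambda v, u: 12 * v**2 * u,
    x,
    d,
)
```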
Testing
Tools for testing numerical correctness in Pymanopt.
Note
The functions rgrad(), euclidean_to_riemannian_gradient(), ehess() and euclidean_to_riemannian_hessian() will only be correct if the manifold is a submanifold of Euclidean space, that is, if the projection is an orthogonal projection onto the tangent space.
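For such a submanifold, the Riemannian gradient is the orthogonal projection of the Euclidean gradient onto the tangent space. A sketch on the unit sphere, with a central-difference Euclidean gradient standing in for automatic differentiation (illustrative only; the helper's internals may differ):

```python
import numpy as np


def riemannian_gradient(cost, projector, eps=1e-6):
    # Riemannian gradient on a Euclidean submanifold: project the
    # (numerically estimated) Euclidean gradient onto the tangent space.
    def rgrad(point):
        egrad = np.array([
            (cost(point + eps * e) - cost(point - eps * e)) / (2 * eps)
            for e in np.eye(point.size)
        ])
        return projector(point, egrad)
    return rgrad


def sphere_projector(x, v):
    # Orthogonal projection onto the tangent space of the unit sphere.
    return v - (x @ v) * x


c = np.array([1.0, 2.0, 3.0])
rgrad = riemannian_gradient(lambda v: c @ v, sphere_projector)
x = np.array([1.0, 0.0, 0.0])  # point on the unit sphere
g = rgrad(x)
# g is tangent at x (x @ g vanishes): the radial component of c is removed.
```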
- pymanopt.tools.testing.riemannian_gradient(cost, projector)[source]
Generates the Riemannian gradient of a cost function.
- pymanopt.tools.testing.euclidean_to_riemannian_gradient(projector)[source]
Generates a euclidean_to_riemannian_gradient function.
- pymanopt.tools.testing.euclidean_to_riemannian_hessian(projector)[source]
Generates a euclidean_to_riemannian_hessian function.
Specifically, euclidean_to_riemannian_hessian(proj)(point, euclidean_gradient, euclidean_hessian, tangent_vector) converts the Euclidean Hessian-vector product euclidean_hessian at a point point to a Riemannian Hessian-vector product, i.e., the directional derivative of the gradient in the tangent direction tangent_vector. Similar to riemannian_hessian(), this is not efficient as it computes the Jacobian explicitly.
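The conversion can be illustrated on a submanifold with orthogonal projector P: differentiating grad f(x) = P(x) egrad(x) along u and projecting gives Hess f(x)[u] = P(x)(ehess(x)[u] + D P(x)[u] egrad(x)). A sketch on the unit sphere, using finite differences for the projector derivative (illustrative only; the library's internals differ):

```python
import numpy as np


def euclidean_to_riemannian_hessian(projector, eps=1e-6):
    # The arguments mirror the documented signature: euclidean_hessian is
    # the Euclidean Hessian-vector product along tangent_vector.
    def rhess(point, euclidean_gradient, euclidean_hessian, tangent_vector):
        # D P(x)[u] applied to the Euclidean gradient, by central differences.
        dproj = (
            projector(point + eps * tangent_vector, euclidean_gradient)
            - projector(point - eps * tangent_vector, euclidean_gradient)
        ) / (2 * eps)
        return projector(point, euclidean_hessian + dproj)
    return rhess


def sphere_projector(x, v):
    # Orthogonal projection onto the tangent space of the unit sphere.
    return v - (x @ v) * x


c = np.array([1.0, 2.0, 3.0])
rhess = euclidean_to_riemannian_hessian(sphere_projector)
x = np.array([1.0, 0.0, 0.0])  # point on the unit sphere
u = np.array([0.0, 1.0, 0.0])  # tangent vector at x
# For f(x) = <c, x> the Euclidean Hessian vanishes; the Riemannian
# Hessian on the sphere is known to be -(c @ x) * u.
h = rhess(x, c, np.zeros(3), u)
```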