# Tools

## Multi Tools

pymanopt.tools.multi.multitransp(A)[source]

Vectorized matrix transpose.

A is assumed to be an array containing M matrices, each of which has dimension N x P. That is, A is an M x N x P array. Multitransp then returns an array containing the M matrix transposes of the matrices in A, each of which will be P x N.
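For concreteness, a minimal usage sketch; the comparison against numpy.transpose is an assumption based on the description above:

```python
import numpy as np

from pymanopt.tools.multi import multitransp

A = np.random.normal(size=(4, 3, 5))  # M = 4 matrices of shape 3 x 5
At = multitransp(A)
assert At.shape == (4, 5, 3)
# Per the description, this should agree with swapping the last two axes.
assert np.allclose(At, np.transpose(A, (0, 2, 1)))
```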

pymanopt.tools.multi.multihconj(A)[source]

Vectorized matrix conjugate transpose.
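A short sketch, assuming the conjugate transpose is applied to each matrix slice in the same batched fashion as multitransp():

```python
import numpy as np

from pymanopt.tools.multi import multihconj

A = np.random.normal(size=(2, 3, 4)) + 1j * np.random.normal(size=(2, 3, 4))
Ah = multihconj(A)
# Conjugate transpose applied slice-wise.
assert np.allclose(Ah, np.conj(np.transpose(A, (0, 2, 1))))
```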

pymanopt.tools.multi.multisym(A)[source]

Vectorized matrix symmetrization.

Given an array A of matrices (represented as an array of shape (k, n, n)), returns a version of A with each matrix symmetrized, i.e., every matrix A[i] satisfies A[i] == A[i].T.
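A sketch of the expected behavior, assuming each slice is replaced by its symmetric part 0.5 * (A[i] + A[i].T):

```python
import numpy as np

from pymanopt.tools.multi import multisym

A = np.random.normal(size=(3, 4, 4))
S = multisym(A)
# Every slice now equals its own transpose.
assert np.allclose(S, np.transpose(S, (0, 2, 1)))
# Assumed definition: the symmetric part of each slice.
assert np.allclose(S, 0.5 * (A + np.transpose(A, (0, 2, 1))))
```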

pymanopt.tools.multi.multiskew(A)[source]

Vectorized matrix skew-symmetrization.

Similar to multisym(), but returns an array where each matrix A[i] is skew-symmetric, i.e., satisfies A[i] == -A[i].T.
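Analogously, assuming each slice is replaced by its skew-symmetric part 0.5 * (A[i] - A[i].T):

```python
import numpy as np

from pymanopt.tools.multi import multiskew

A = np.random.normal(size=(3, 4, 4))
K = multiskew(A)
# Every slice is skew-symmetric.
assert np.allclose(K, -np.transpose(K, (0, 2, 1)))
```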

pymanopt.tools.multi.multieye(k, n)[source]

Array of k identity matrices, each of shape n x n.
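Minimal usage sketch:

```python
import numpy as np

from pymanopt.tools.multi import multieye

identity_batch = multieye(3, 2)
assert identity_batch.shape == (3, 2, 2)
assert np.allclose(identity_batch[0], np.eye(2))
```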

pymanopt.tools.multi.multilogm(A, *, positive_definite=False)[source]

Vectorized matrix logarithm.
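A hedged sketch pairing the logarithm with multiexpm() (documented below); the symmetric positive definite construction is an illustrative assumption so that positive_definite=True applies:

```python
import numpy as np

from pymanopt.tools.multi import multiexpm, multilogm

# Build a batch of symmetric positive definite matrices.
X = np.random.normal(size=(2, 3, 3))
A = X @ np.transpose(X, (0, 2, 1)) + 3 * np.eye(3)

L = multilogm(A, positive_definite=True)
# The logarithm of an SPD matrix is symmetric, so the exponential
# should invert it slice-wise (up to round-off).
assert np.allclose(multiexpm(L, symmetric=True), A)
```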

pymanopt.tools.multi.multiexpm(A, *, symmetric=False)[source]

Vectorized matrix exponential.
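A sketch checking the batched exponential against scipy.linalg.expm applied slice by slice; the equivalence is an assumption based on the description:

```python
import numpy as np
import scipy.linalg

from pymanopt.tools.multi import multiexpm, multisym

A = multisym(np.random.normal(size=(2, 3, 3)))
E = multiexpm(A, symmetric=True)
reference = np.stack([scipy.linalg.expm(a) for a in A])
assert np.allclose(E, reference)
```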

pymanopt.tools.multi.multiqr(A)[source]

Vectorized QR decomposition.
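A sketch assuming the function returns the pair (Q, R) of slice-wise (reduced) QR factorizations:

```python
import numpy as np

from pymanopt.tools.multi import multiqr

A = np.random.normal(size=(4, 5, 3))
Q, R = multiqr(A)
# Each slice factors as A[i] == Q[i] @ R[i].
assert np.allclose(Q @ R, A)
```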

## Diagnostics

pymanopt.tools.diagnostics.identify_linear_piece(x, y, window_length)[source]

Identify a segment of the curve (x, y) that appears to be linear.

This function attempts to identify a contiguous segment of the curve defined by the vectors x and y that appears to be linear. A line is fit through the data over all windows of length window_length and the best fit is retained. The function returns a pair (segment, poly): segment is the range of indices such that x[segment] is the portion over which (x, y) is most linear, and poly is a first-order polynomial that best fits (x, y) over that segment (highest-degree coefficients first).
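A usage sketch on synthetic data; the piecewise curve is an illustrative assumption:

```python
import numpy as np

from pymanopt.tools.diagnostics import identify_linear_piece

x = np.linspace(0, 10, 100)
# Linear for x < 5, quadratic afterwards.
y = np.where(x < 5, 2 * x + 1, 0.5 * x**2)
segment, poly = identify_linear_piece(x, y, window_length=20)
# poly[0] holds the fitted slope and poly[1] the intercept over x[segment].
```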

pymanopt.tools.diagnostics.check_directional_derivative(problem, x=None, d=None)[source]

Checks the consistency of the cost function and directional derivatives.

check_directional_derivative performs a numerical test to check that the directional derivatives defined in the problem agree up to first or second order with the cost function at some point x, along some direction d. The test is based on a truncated Taylor series. Both x and d are optional and will be sampled at random if omitted.
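A usage sketch; the cost function and its Euclidean gradient are illustrative assumptions:

```python
import numpy as np

import pymanopt
from pymanopt.manifolds import Sphere
from pymanopt.tools.diagnostics import check_directional_derivative

manifold = Sphere(10)

@pymanopt.function.numpy(manifold)
def cost(point):
    return np.sum(point**4)

@pymanopt.function.numpy(manifold)
def euclidean_gradient(point):
    return 4 * point**3

problem = pymanopt.Problem(manifold, cost, euclidean_gradient=euclidean_gradient)
# x and d are omitted, so a random point and direction are used.
check_directional_derivative(problem)
```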

pymanopt.tools.diagnostics.check_gradient(problem, x=None, d=None)[source]

Checks the consistency of the cost function and the gradient.

check_gradient performs a numerical test to check that the gradient defined in the problem agrees up to first order with the cost function at some point x, along some direction d. The test is based on a truncated Taylor series.

The test also verifies that the gradient is indeed a tangent vector.

Both x and d are optional and will be sampled at random if omitted.
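Continuing the sketch above with the same (assumed) problem instance:

```python
from pymanopt.tools.diagnostics import check_gradient

# Reusing `problem` from the previous sketch; with x and d omitted,
# a random point and direction are drawn.
check_gradient(problem)
```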

pymanopt.tools.diagnostics.check_retraction(manifold, point=None, tangent_vector=None)[source]

Check the order of agreement between a retraction and the exponential map.
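Minimal usage sketch:

```python
from pymanopt.manifolds import Sphere
from pymanopt.tools.diagnostics import check_retraction

manifold = Sphere(10)
# With no point or tangent vector supplied, both are sampled at random.
check_retraction(manifold)
```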

## Testing

Tools for testing numerical correctness in Pymanopt.

Note

The functions rgrad(), euclidean_to_riemannian_gradient(), ehess(), and euclidean_to_riemannian_hessian() will only be correct if the manifold is a submanifold of Euclidean space, that is, if the projection is an orthogonal projection onto the tangent space.

Specifically, euclidean_to_riemannian_hessian(proj)(point, euclidean_gradient, euclidean_hessian, tangent_vector) converts the Euclidean Hessian-vector product euclidean_hessian at a point point into a Riemannian Hessian-vector product, i.e., the directional derivative of the gradient in the tangent direction tangent_vector. As with riemannian_hessian(), this is inefficient because it computes the Jacobian explicitly.
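To illustrate the submanifold requirement, a self-contained numpy sketch (not the testing API itself) of converting a Euclidean gradient on the unit sphere, where the orthogonal projection onto the tangent space at x is v ↦ v - (xᵀv)x:

```python
import numpy as np

# Hypothetical projection for the unit sphere, an embedded submanifold
# of Euclidean space: proj(x, v) = v - (x @ v) * x.
def sphere_projection(point, vector):
    return vector - (point @ vector) * point

point = np.random.normal(size=5)
point /= np.linalg.norm(point)
euclidean_gradient = np.random.normal(size=5)

# The Riemannian gradient is the orthogonal projection of the
# Euclidean gradient onto the tangent space.
riemannian_gradient = sphere_projection(point, euclidean_gradient)
# The result is tangent, i.e. orthogonal to the point.
assert abs(point @ riemannian_gradient) < 1e-10
```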