Analogous to what was done with k-opt, we should add functionality to optimize a user-specified function over the set of doubly-stochastic matrices. (Doubly stochastic matrices are nonnegative matrices whose rows and columns each sum to 1.)
- `scipy.optimize` could be used for the (local) optimization starting from a user-specified starting point. This is a classic problem for the `SLSQP` algorithm, but `trust-constr` is also appropriate. The choice of algorithm could be a user option, or we could just choose one; this is intended as a useful utility and should not be confused with a cutting-edge treatment of this problem.
- The default starting point could be constructed by (1) taking a random orthogonal matrix and (2) squaring all its elements. (Each row and column of an orthogonal matrix has unit norm, so the squared entries are nonnegative and sum to 1 along every row and column.)
- As a user option, the function could return the permutation matrix that is closest to the doubly-stochastic matrix that is found; see the sketch after this list.
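
As a rough illustration only, here is a minimal sketch of the proposed utility, assuming `scipy.optimize.minimize` with `SLSQP`; the names `random_doubly_stochastic` and `optimize_doubly_stochastic` are hypothetical, not an existing API:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment, minimize
from scipy.stats import ortho_group


def random_doubly_stochastic(n, rng=None):
    """Default starting point: elementwise square of a random orthogonal
    matrix. Rows/columns of an orthogonal matrix have unit norm, so the
    squared entries are nonnegative and sum to 1 along rows and columns."""
    q = ortho_group.rvs(n, random_state=rng)
    return q ** 2


def optimize_doubly_stochastic(fun, n, x0=None, return_permutation=False):
    """Locally minimize fun(X) over n-by-n doubly-stochastic matrices X."""
    if x0 is None:
        x0 = random_doubly_stochastic(n)

    def row_col_sums(x):
        m = x.reshape(n, n)
        # All row sums and all-but-one column sums equal 1; the last
        # column-sum constraint is redundant and is dropped so SLSQP does
        # not see linearly dependent equality constraints.
        return np.concatenate([m.sum(axis=1) - 1.0, m.sum(axis=0)[:-1] - 1.0])

    res = minimize(
        lambda x: fun(x.reshape(n, n)),
        x0.ravel(),
        method="SLSQP",
        bounds=[(0.0, 1.0)] * (n * n),  # nonnegativity; entries <= 1 follows
        constraints={"type": "eq", "fun": row_col_sums},
    )
    ds = res.x.reshape(n, n)
    if return_permutation:
        # Closest permutation matrix in Frobenius norm: maximize <P, ds>,
        # which is exactly the assignment problem (Hungarian algorithm).
        rows, cols = linear_sum_assignment(-ds)
        perm = np.zeros_like(ds)
        perm[rows, cols] = 1.0
        return ds, perm
    return ds
```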
This sort of problem showed up in @wilhadams's work. For an objective function of the form arising in one-sided permutation Procrustes, it amounts to a (slow, iterative) alternative to the Hungarian algorithm for the assignment problem.
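
To make that equivalence concrete, here is a hedged usage example built on the hypothetical sketch above: for `min_P ||A P - B||_F^2` over permutations, the exact answer is the assignment problem on `-(A^T B)`, and the iterative route should usually recover it.

```python
# Hypothetical usage: a one-sided permutation Procrustes objective.
rng = np.random.default_rng(0)
a = rng.standard_normal((5, 5))
b = rng.standard_normal((5, 5))

# Iterative route: relax to doubly-stochastic matrices, then round.
objective = lambda x: np.linalg.norm(a @ x - b) ** 2
_, p_iter = optimize_doubly_stochastic(objective, 5, return_permutation=True)

# Exact route: min_P ||A P - B||_F^2  <=>  max_P <P, A^T B>, i.e., the
# assignment problem, solved directly by the Hungarian algorithm.
rows, cols = linear_sum_assignment(-(a.T @ b))
p_exact = np.zeros((5, 5))
p_exact[rows, cols] = 1.0
# The two answers usually agree; the iterative route is slower and only
# locally optimal, so it can miss on hard instances.
```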
Note that in Procrustes problems of a quite general (tensor-ish) sense, it is usually important to maximize the cosine/overlap between the optimizee and the target (or minimize its negative), as direct minimization of the error doubles the degree of the optimization problem.
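
Spelling out the degree-doubling point for the matrix case: for a target $T$ and optimizee $X$,

$$\|X - T\|_F^2 = \|X\|_F^2 - 2\,\langle X, T\rangle_F + \|T\|_F^2,$$

so the overlap $\langle X, T\rangle_F = \operatorname{tr}(X^\top T)$ is linear in $X$, while the squared error carries the quadratic term $\|X\|_F^2$. If $X$ is itself of degree $d$ in the underlying variables (the tensor-ish case), the error is degree $2d$ but the overlap stays at degree $d$.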