Is there any way to implement the perspective function of the log det function, x\log|I+\frac{Y}{x}|, x \geq 0, Y \in S^+, in CVX? @mcg
@jmiao I believe this can be done using CVXQUAD (https://github.com/hfawzi/cvxquad), which supports quantum_rel_entr(X,Y), defined as trace(X*(logm(X)-logm(Y))).
-quantum_rel_entr(x*eye(n), x*eye(n) + Y) = x*log(det(eye(n) + Y/x))
which is the matrix (quantum) generalization of @mcg's scalar formula shown in Perspective function.
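For concreteness, here is a minimal, untested sketch (my own illustration, not from any of the linked threads) of using this as a CVX objective with CVXQUAD installed; the dimension n = 3 and the constraints x <= 1, trace(Y) <= 1 are just placeholders for whatever your actual problem needs:
n = 3;   % keep the dimension small; quantum_rel_entr gets expensive quickly
cvx_begin
    variable x nonnegative
    variable Y(n,n) semidefinite
    % objective equals x*log(det(eye(n) + Y/x)) by the identity above
    maximize( -quantum_rel_entr(x*eye(n), x*eye(n) + Y) )
    subject to
        x <= 1;           % placeholder constraint
        trace(Y) <= 1;    % placeholder constraint
cvx_end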
Full disclosure: I had never used CVXQUAD. @awinick, who tried it, wrote in Adding Quantum Relative Entropy to CVX: "I found that both the run time and memory usage meant that I could only consider trivial problems of interest." But perhaps your problems of interest, if you're still interested a year later, are smaller than his.
Update: I've now tried some optimization with quantum_rel_entr under CVX 2.1. As the matrix dimension increases, it can take a long time for CVX just to process an expression containing quantum_rel_entr, let alone for the solver to solve the problem. And you can easily run out of memory.
When using it under CVX 3.0beta, I encountered an error in kron which I have not tried to diagnose.
Here is how to formulate x\log|I+ xY^{-1}|, x \geq 0,Y \in S^{++}, which is jointly convex in x and Y.
x*log(det(eye(n) + x*inv(Y)))
can be formulated as
quantum_rel_entr(x*eye(n)+Y,Y) + quantum_rel_entr(Y,x*eye(n)+Y)
which turns out to be the quantum (matrix) analog of @Michal_Adamaszek's scalar formulation
x*log(1+x/y) = rel_entr(x+y,y) + rel_entr(y,x+y)
from Writing x*log(1+x/y)
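As a quick numerical sanity check of this identity (my own, applying the trace(X*(logm(X)-logm(Y))) definition of quantum_rel_entr directly to numeric matrices rather than CVX expressions):
n = 3;
x = rand;                            % any x > 0
R = randn(n); Y = R*R' + eye(n);     % any Y in S++
qre = @(A,B) trace(A*(logm(A) - logm(B)));   % quantum relative entropy
lhs = x*log(det(eye(n) + x*inv(Y)));
rhs = qre(x*eye(n)+Y, Y) + qre(Y, x*eye(n)+Y);
disp(abs(lhs - rhs))                 % zero up to roundoff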
Here is how to formulate x\log|I+ Y(Y+xI)^{-1}|, x \geq 0,Y \in S^{+}, Y+xI \in S^{++}
or equivalently, x\log|I+ (Y+xI)^{-1}Y|, x \geq 0,Y \in S^{+}, Y+xI \in S^{++}, which are jointly concave in x and Y.
Equivalence of the two expressions holds because Y and (Y+xI)^{-1} commute, since they are simultaneously diagonalizable (they share the same eigenvectors).
x*log(det(eye(n) + Y*inv(x*eye(n)+Y)))
and
x*log(det(eye(n) + inv(x*eye(n)+Y)*Y))
can both be formulated as
-2*quantum_rel_entr(x*eye(n)+Y,x*eye(n)+2*Y) - quantum_rel_entr(x*eye(n)+2*Y,x*eye(n)+Y)
which turns out to be the quantum (matrix) analog of the scalar formulation
x*log(1+y/(x+y)) = -2*rel_entr(x+y,x+2*y) - rel_entr(x+2*y,x+y)
from Xlog( 1+ Y/(X+Y) ): DCP rules, and build-in functions in cvx
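Again as my own untested illustration (small n, placeholder constraint), this concave expression could be maximized in CVX along the lines of:
n = 3;
cvx_begin
    variable x nonnegative
    variable Y(n,n) semidefinite
    % objective equals x*log(det(eye(n) + Y*inv(x*eye(n)+Y))) by the identity above
    maximize( -2*quantum_rel_entr(x*eye(n)+Y, x*eye(n)+2*Y) ...
              - quantum_rel_entr(x*eye(n)+2*Y, x*eye(n)+Y) )
    subject to
        x + trace(Y) <= 1;   % placeholder constraint
cvx_end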
Here is how to formulate x\log|I+ x(Y+xI)^{-1}|, x \geq 0,Y \in S^{+}, Y+xI \in S^{++}, which is jointly convex in x and Y.
x*log(det(eye(n) + x*inv(x*eye(n)+Y)))
can be formulated as
quantum_rel_entr(x*eye(n)+Y,2*x*eye(n)+Y) + quantum_rel_entr(2*x*eye(n)+Y,x*eye(n)+Y)
which turns out to be the quantum (matrix) analog of the scalar formulation
x*log(1+x/(x+y)) = rel_entr(x+y,2*x+y) + rel_entr(2*x+y,x+y)
from Here's how to handle x*log(1+x/(x+y)) for x >= 0, y >= 0
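And the corresponding numerical sanity check of this last identity (my own, again using the trace-based definition of quantum_rel_entr directly):
n = 3;
x = rand; R = randn(n); Y = R*R';    % Y in S+, so x*eye(n)+Y in S++ for x > 0
qre = @(A,B) trace(A*(logm(A) - logm(B)));
lhs = x*log(det(eye(n) + x*inv(x*eye(n)+Y)));
rhs = qre(x*eye(n)+Y, 2*x*eye(n)+Y) + qre(2*x*eye(n)+Y, x*eye(n)+Y);
disp(abs(lhs - rhs))                 % zero up to roundoff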