Help with log det expression

(mkrck) #1

Hello, I am studying the following convex optimization problem and trying to translate it into CVX. I have no problems with the final two terms; it’s the logdet part that I can’t express.

(Below, A is a rectangular matrix, B is square, and \lambda is a parameter; only X is unknown.)

\min \limits_{X \succ 0} \left( \log \det ( A X^{-1} A^T + I) + \text{trace}(BX) + \lambda \|\text{vec}(X)\|_{\ell_1} \right)

I have searched these forums and the reference guide to no avail.
If this problem is not suitable for CVX and you think of an alternative, please share! Thank you.


Proof of convexity of the logdet term: since \det(AX^{-1}A^T + I) = \det(X + A^TA)/\det(X), the term equals \log \det (X + A^TA) - \log \det (X). Its second directional derivative along any symmetric V is \mathrm{tr}((X^{-1}V)^2) - \mathrm{tr}(((X + A^TA)^{-1}V)^2) \ge 0, because X \preceq X + A^TA, so the term is convex on the positive definite cone.
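Both the convexity claim and the underlying determinant identity \det(AX^{-1}A^T + I) = \det(X + A^TA)/\det(X) are easy to spot-check numerically. A quick NumPy sketch (illustrative sizes and random data, not CVX code):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5                       # illustrative sizes: A is m-by-n (rectangular)
A = rng.standard_normal((m, n))

def rand_pd(n):
    """Random symmetric positive definite n-by-n matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

def f(X):
    """log det(A X^{-1} A^T + I), computed via slogdet."""
    return np.linalg.slogdet(A @ np.linalg.inv(X) @ A.T + np.eye(m))[1]

X = rand_pd(n)
# determinant identity: det(A X^{-1} A^T + I) = det(X + A^T A) / det(X)
lhs = f(X)
rhs = np.linalg.slogdet(X + A.T @ A)[1] - np.linalg.slogdet(X)[1]
assert np.isclose(lhs, rhs)

# midpoint convexity spot check: f((X1+X2)/2) <= (f(X1) + f(X2))/2
X1, X2 = rand_pd(n), rand_pd(n)
assert f((X1 + X2) / 2) <= (f(X1) + f(X2)) / 2 + 1e-12
```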

(Mark L. Stone) #2

Whoops. Kind of read over the inverse on X.

See my formulation via Schur complement in my next post.

(mkrck) #3

But log det(positive definite) is concave, not convex

I’m afraid I don’t understand. For X positive definite, \log \det (X^{-1}) = -\log \det (X) is convex . . .

Do you mean that it violates some CVX format since the outer \log \det (\cdot) function is concave and we’re searching for a minimum? If so, then I can use matrix identities to reformat the term as

-\log \det \left( ( AX^{-1} A^T + I)^{-1} \right) = -\log \det \left( I - A(X + A^TA)^{-1} A^T \right)

so that the outer function is convex.
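The identity above is a Woodbury-type matrix inversion and can be verified numerically. A NumPy sketch (arbitrary illustrative sizes, not part of the original post):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5                       # illustrative sizes
A = rng.standard_normal((m, n))
M = rng.standard_normal((n, n))
X = M @ M.T + n * np.eye(n)       # random positive definite X

# Woodbury-type identity: (A X^{-1} A^T + I)^{-1} = I - A (X + A^T A)^{-1} A^T
lhs = np.linalg.inv(A @ np.linalg.inv(X) @ A.T + np.eye(m))
rhs = np.eye(m) - A @ np.linalg.inv(X + A.T @ A) @ A.T
assert np.allclose(lhs, rhs)
```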

(Mark L. Stone) #4

I have withdrawn my original answer because I misread the problem.

By Schur complement,
det(eye(n) - A*inv(X + A'*A)*A') * det(X + A'*A) = det([eye(n) A; A' X+A'*A]).
So -log_det(eye(n) - A*inv(X + A'*A)*A') = -log_det([eye(n) A; A' X+A'*A]) + log_det(X + A'*A)
Whoops, the 2nd term on the RHS won’t be accepted by CVX: it is a concave log_det term appearing in a minimization, which violates CVX’s rules. I made a mistake in my first attempted solution by Schur complement, which didn’t have the “illegal” term.
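The Schur-complement determinant identity can be sanity-checked in NumPy. As a side observation (not in the original post), eliminating the identity block instead shows the block determinant is just det(X). A sketch with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 5                       # illustrative sizes
A = rng.standard_normal((m, n))
M = rng.standard_normal((n, n))
X = M @ M.T + n * np.eye(n)       # random positive definite X

S = X + A.T @ A
block = np.block([[np.eye(m), A],
                  [A.T,       S]])

# Schur complement on S: det(I - A S^{-1} A^T) * det(S) = det([I A; A^T S])
lhs = np.linalg.det(np.eye(m) - A @ np.linalg.inv(S) @ A.T) * np.linalg.det(S)
assert np.isclose(lhs, np.linalg.det(block))

# Schur complement on the identity block: det([I A; A^T S]) = det(S - A^T A) = det(X)
assert np.isclose(np.linalg.det(block), np.linalg.det(X))
```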

(mkrck) #5

OK no problem Mark. Thank you for the advice – will give the Schur complement a try.

(Mark L. Stone) #6

Sorry again (not a good day). I made a mistake in my Schur complement formulation, and the corrected formulation would not be accepted by CVX.

(ADITYA GUPTA) #7

Hello, I am facing a similar kind of problem while using CVX.
Applying log_det to a CVX concave expression (a 3x3 matrix) results in an error.
variable Z(3,3)
log_det(eye(3,3) + Z - power(Z,2))
I am ensuring that Z is positive definite as well as Hermitian.
Is there any other function apart from det_rootn that could be used here, or are there other possible workarounds?
Attaching a snapshot.

@mcg @Mark_L_Stone

(Mark L. Stone) #8

Have you proven that your optimization problem is convex? See “Why isn't CVX accepting my model? READ THIS FIRST!”

The argument of log_det is concave, but not affine, which it must be in order to be accepted by CVX.

help log_det

log_det Logarithm of the determinant of an SDP matrix.
For a square matrix X, log_det(X) returns
LOG(DET(X))
if X is symmetric (real) or Hermitian (complex) and positive semidefinite,
and -Inf otherwise.

When used in a CVX model, log_det(X) causes CVX's successive 
approximation method to be invoked, producing results exact to within
the tolerance of the solver. Therefore, whenever possible, the use of
DET_ROOTN(X) is to be preferred, because the latter choice can be
represented exactly in an SDP. For example, the objective
    MAXIMIZE(log_det(X))
can be (and should be) replaced with
    MAXIMIZE(DET_ROOTN(X))
in fact, log_det(X) is implemented simply as N*LOG(DET_ROOTN(X)).

Disciplined convex programming information:
    log_det is concave and nonmonotonic; therefore, when used in
    CVX specifications, its argument must be affine.
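The relationship in the last line of the help text is just \log \det(X) = n \log(\det(X)^{1/n}). A NumPy spot-check (illustrative only, not CVX code):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
M = rng.standard_normal((n, n))
X = M @ M.T + n * np.eye(n)       # random symmetric positive definite X

log_det = np.linalg.slogdet(X)[1]          # log(det(X))
det_rootn = np.linalg.det(X) ** (1.0 / n)  # det(X)^(1/n), the quantity DET_ROOTN computes
assert np.isclose(log_det, n * np.log(det_rootn))
```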

(ADITYA GUPTA) #9

Thank you for your quick response.
We have gone through the thread you have mentioned, we agree that log_det take only affine function, is there any other way to solve log_det with concave argument using cvx?

(Mark L. Stone) #10

Only if there is a reformulation which follows CVX’s rules.

But as the previously linked FAQ (which I encourage you to read carefully) makes clear, your first job is to prove convexity of the optimization problem you wish to solve. If it is proven to be convex, then you can worry about whether there are reformulations which would be accepted by CVX. Note that, of course, det_rootn also requires its argument to be affine.