Hi, I’m implementing an optimization problem but I’m getting stuck on a constraint that involves a matrix inverse, shown below:

`ln | eye(4) + V*(V^H)*(E^(-1)) | <= C`

where (^H) is the Hermitian transpose and C is a constant. V and E are declared as

`variable V(4,1) complex`
`variable E(4,4)`

I’ve searched for a way to implement the inverse of the matrix E in CVX but found nothing. Could anyone help me implement the inequality above?

Thanks so much.


I don’t know what optimization problem this is a part of, but it looks like you will have more difficulties than just a matrix inverse.

V*V' is non-convex and is not valid in CVX. The same goes for multiplying it by another CVX variable (E, or even the inverse of E). And then you are taking a determinant and bounding it above by something? This looks like non-convexities piled on top of each other.

Until you show otherwise (or clarify or fix up the problem being solved), this will be presumed to be non-convex.

Things you can do with the inverse of a matrix variable in CVX include using matrix_frac, or using a Schur complement formulation to remove the inverse. Neither of these seems to fit your situation.
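For reference, the Schur complement route mentioned above replaces `t >= x'*inv(E)*x` (for E positive definite) with the semidefinite constraint `[E x; x' t] >= 0`. A quick numerical sanity check of that equivalence, with illustrative random data not taken from this thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random positive definite E and a vector x (illustrative data).
A = rng.standard_normal((4, 4))
E = A @ A.T + 4 * np.eye(4)        # symmetric, positive definite
x = rng.standard_normal((4, 1))

# Quadratic form x' * inv(E) * x -- what matrix_frac(x, E) evaluates in CVX.
q = float(x.T @ np.linalg.solve(E, x))

# Schur complement: for E > 0,  t >= x' inv(E) x  <=>  [[E, x], [x', t]] is PSD.
def schur_block(t):
    return np.block([[E, x], [x.T, np.array([[t]])]])

def is_psd(M, tol=1e-9):
    return np.min(np.linalg.eigvalsh(M)) >= -tol

print(is_psd(schur_block(q + 1e-6)))  # True:  t just above the quadratic form
print(is_psd(schur_block(q - 1e-3)))  # False: t just below it
```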

Edit: this post is superseded by the following posts.


So many thanks for your reply, Mark. I’ve solved the problem. Since the expression sits inside log det(·), it reduces to the log of a scalar thanks to a determinant identity. I’ve transformed it like this:

`ln | eye(4) + V*(V^H)*(E^(-1)) | = ln(1 + (V^H)*(E^(-1))*V)`

Thus, (V^H)*(E^(-1))*V can be expressed with matrix_frac, as you mentioned. One remaining problem was that log(... matrix_frac ...) would not run, so I transformed the inequality once more by exponentiating both sides:

`1 + matrix_frac(V,E) <= exp(C)`

and then it ran successfully. Special thanks.
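As a numerical sanity check of the two steps above (not part of the original post, random data for illustration): the rank-one determinant identity collapses the log-det to a scalar log, and exponentiating both sides gives an equivalent constraint.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random complex vector V and Hermitian positive definite E (illustrative).
V = rng.standard_normal((4, 1)) + 1j * rng.standard_normal((4, 1))
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
E = B @ B.conj().T + 4 * np.eye(4)   # Hermitian, positive definite

# Left side: ln det( eye(4) + V * V^H * inv(E) ).
lhs = np.log(np.linalg.det(np.eye(4) + V @ V.conj().T @ np.linalg.inv(E))).real

# Right side: ln(1 + V^H * inv(E) * V); the quadratic form is matrix_frac(V, E).
q = (V.conj().T @ np.linalg.solve(E, V)).item().real
rhs = np.log(1 + q)

print(np.isclose(lhs, rhs))  # True: log-det equals log of the scalar

# Exponentiating both sides:  ln(1 + q) <= C  <=>  1 + q <= exp(C).
C = rhs + 0.1
print((1 + q <= np.exp(C)) == (np.log(1 + q) <= C))  # True
```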


Sorry, I missed the application of Sylvester’s determinant identity to reformulate

`det(eye(4)+V*V'*inv(E))`

to

`1+V'*inv(E)*V`
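A quick numerical check of Sylvester’s determinant identity, det(I_m + A*B) = det(I_n + B*A), in the rank-one case used here (random data for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)

# Rank-one case from the thread: A = V (4x1), B = V^H * inv(E) (1x4).
V = rng.standard_normal((4, 1)) + 1j * rng.standard_normal((4, 1))
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
E = M @ M.conj().T + 4 * np.eye(4)   # Hermitian, positive definite

A = V
B = V.conj().T @ np.linalg.inv(E)

# det of the 4x4 matrix vs. det of the 1x1 matrix 1 + V^H * inv(E) * V.
d_big = np.linalg.det(np.eye(4) + A @ B)
d_small = np.linalg.det(np.eye(1) + B @ A)

print(np.isclose(d_big, d_small))  # True
```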
