What is the Complexity Order of CVX for linear optimisation with determinant constraints?

I am solving the following problem with CVX.

minimise a linear function of \mathbf{x}, where \mathbf{x} is a K\times 1 column vector (all components greater than or equal to zero), subject to \det(\mathbf{I}+x_1\mathbf{A}_1+\cdots+x_K\mathbf{A}_K) \ge a certain floor, where the \mathbf{A}_i are given M\times M positive definite matrices.

Can you give some rough idea of the complexity of the problem with respect to both K and M? Is it quadratic or cubic in them?

This is outside the scope of my personal expertise, and really outside the scope of this forum.

Sorry. Since it involves the inner workings of CVX, this seemed like the right place to ask how CVX solves the problem.

Not really the inner workings of CVX, but rather the solvers that CVX uses.

Perhaps Erling from MOSEK will come along. I think he has posted before on complexity of various SDPs.

If you want a theoretical answer, then you first convert your problem to a standard-form SDP, i.e.

minimise c^\top\mathbf{x} subject to \mathbf{F}_0 + x_1\mathbf{F}_1 + \cdots + x_K\mathbf{F}_K \succeq 0, where the \mathbf{F}_i are symmetric matrices.
Next you look up the complexity of solving such an SDP. I guess that is covered in the book by Boyd and Vandenberghe.
My guess is that the worst-case complexity will be cubic in M; after all, just checking positive definiteness of an M\times M matrix is cubic in M.
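For a rough worst-case sketch, the standard interior-point estimates from the SDP literature (not specific to whichever solver CVX calls), with K scalar variables and one M\times M semidefinite block, are on the order of:

```latex
% Per-iteration cost: forming the Schur complement system dominates,
% roughly O(K M^3 + K^2 M^2) to form it and O(K^3) to factor it;
% the iteration count grows like O(sqrt(M) log(1/eps)) for accuracy eps.
\underbrace{\mathcal{O}\!\big(KM^{3} + K^{2}M^{2} + K^{3}\big)}_{\text{per iteration}}
\;\times\;
\underbrace{\mathcal{O}\!\big(\sqrt{M}\,\log(1/\varepsilon)\big)}_{\text{iterations}}
```

This is consistent with the cubic-in-M guess above, with additional quadratic and cubic terms in K.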

If you want to know what the practical running time is, then the answer would be along the lines of

adapted to the SDP case.


Thanks a lot, but the problem is not semidefinite programming.

The constraint is on a scalar (a determinant, actually), not a positive-definiteness constraint on a matrix.

Given the other answers I thought it could be rewritten as an SDP. If that is not the case, then I cannot help. Sorry.

I think Erling’s point is that the stated problem can be written as a standard SDP, even though it doesn’t take that form as you (OP) have stated it.

It can indeed be written as a standard SDP. Though as you have written it, the constraint is not even in a convex (DCP-compliant) form: the determinant itself is neither convex nor concave. You need to take the $M$th root of the determinant (CVX's det_rootn function) to make it so.
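A quick numerical illustration of that last point, using hand-picked 2\times 2 matrices (my own example, not from the thread): det is not concave on the positive definite cone, since the determinant at a midpoint can fall below the average of the endpoint determinants, whereas \det(\cdot)^{1/M} behaves consistently with concavity.

```python
import numpy as np

# Two positive definite 2x2 matrices (illustrative choice)
A = np.eye(2)          # det(A) = 1
B = 9.0 * np.eye(2)    # det(B) = 81
mid = 0.5 * (A + B)    # midpoint 5*I, det(mid) = 25

det_mid = np.linalg.det(mid)
avg_det = 0.5 * (np.linalg.det(A) + np.linalg.det(B))
print(det_mid, avg_det)          # 25 < 41: det is NOT concave

n = 2  # matrix dimension (the "M" of the thread)
rootn_mid = det_mid ** (1 / n)
avg_rootn = 0.5 * (np.linalg.det(A) ** (1 / n) + np.linalg.det(B) ** (1 / n))
print(rootn_mid, avg_rootn)      # 5 >= 5: consistent with concavity of det^(1/n)
```

This is why CVX rejects a raw det(...) >= a constraint but accepts det_rootn(...) >= a^(1/M), even though both describe the same (convex) feasible set.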