# Can I use CVX for minimizing sum of abs of negative eigenvalues of a parametric symmetric matrix

Assume K is a symmetric parametric matrix with K(x,y) = q(x) K0(x,y) q(y), where K0 is a symmetric, non-positive-definite matrix and q(x) is a function of x that depends on unknown parameters a_i.
The objective is to minimize the sum of the absolute values of the negative eigenvalues of K, which is a convex function.
Can I use CVX to solve this problem, i.e., to find optimal values for the unknown parameters of the matrix K?

If K is merely not positive definite, and not necessarily negative semidefinite, i.e., it could be indefinite, why do you say that the sum of the absolute values of the negative eigenvalues of K is a convex function, other than that you would like it to be? Perhaps you are aware of a result that I am not. I’m too lazy to try to come up with a counterexample right now.

Edit: Alright, that was stupid of me. In light of mcg’s answer below, I withdraw my question.

I think it might be. Suppose `Z` is your matrix. Then will this do it?

```
cvx_begin
    variable Z(n,n) symmetric
    variable ZP(n,n) semidefinite
    variable ZM(n,n) semidefinite
    minimize( trace(ZM) )
    Z == ZP - ZM
    % other constraints on Z go here
cvx_end
```

I have no idea if any of your other constraints satisfy the DCP rules, however. You’ll have to determine that for yourself.
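As a sanity check on this formulation (in Python/NumPy rather than CVX, and my own sketch, not from the thread): the optimal split is known in closed form, with ZP and ZM the positive and negative parts of Z's eigendecomposition, so trace(ZM) equals the sum of the absolute values of Z's negative eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
Z = (A + A.T) / 2  # random symmetric test matrix

# Optimal split: ZP carries the positive eigenvalues, ZM the negative ones.
w, V = np.linalg.eigh(Z)
ZP = V @ np.diag(np.maximum(w, 0)) @ V.T
ZM = V @ np.diag(np.maximum(-w, 0)) @ V.T

assert np.allclose(Z, ZP - ZM)                  # feasibility: Z == ZP - ZM
assert np.all(np.linalg.eigvalsh(ZP) >= -1e-9)  # ZP is PSD
assert np.all(np.linalg.eigvalsh(ZM) >= -1e-9)  # ZM is PSD

# trace(ZM) equals the sum of |negative eigenvalues| of Z
print(np.isclose(np.trace(ZM), np.abs(w[w < 0]).sum()))  # True
```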

The sum of the k smallest eigenvalues of a symmetric matrix is concave. That leads to the convexity of the sum of the absolute values of the negative eigenvalues of a symmetric matrix.

Actually, no, you can’t get there that way, because there’s no way to be sure that the number of negative eigenvalues is fixed at k. It is indeed convex, though, and can be represented as I showed you above.
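For what it's worth, the convexity of this spectral function is easy to spot-check numerically. A minimal Python/NumPy sketch (my own, not from the thread) samples random symmetric pairs and tests the convexity inequality:

```python
import numpy as np

def neg_part_sum(Z):
    """Sum of absolute values of the negative eigenvalues of symmetric Z."""
    w = np.linalg.eigvalsh(Z)
    return np.abs(w[w < 0]).sum()

rng = np.random.default_rng(1)
n = 4
violations = 0
for _ in range(1000):
    A = rng.standard_normal((n, n)); A = (A + A.T) / 2
    B = rng.standard_normal((n, n)); B = (B + B.T) / 2
    t = rng.uniform()
    # Convexity: f(tA + (1-t)B) <= t f(A) + (1-t) f(B)
    lhs = neg_part_sum(t * A + (1 - t) * B)
    rhs = t * neg_part_sum(A) + (1 - t) * neg_part_sum(B)
    if lhs > rhs + 1e-9:
        violations += 1
print(violations)  # 0: no convexity violations found
```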

In this problem Z = Q * K0 * Q, where Q is a diagonal matrix whose diagonal elements are sum(alphacvx.*exp(D)), alphacvx is a vector variable with N elements, D is a 1×N vector, and K0 is an N×N matrix; the constraint of this problem is alphacvx(i) >= 0. Would you please tell me whether the following CVX formulation of this problem is correct or not?

```
cvx_begin
    variable alphacvx(N)
    expression Q(N,N);
    expression C(N);
    variable Z(N,N) symmetric;
    variable ZP1(N,N) semidefinite
    variable ZN(N,N) semidefinite

    for i = 1:N
        C(i) == sum(alphacvx.*exp(D));
    end
    for i = 1:size(Q,1)
        for j = 1:size(Q,2)
            Z(i,j) == C(i)*C(j)*K0(i,j);
        end
    end
    minimize(trace(ZN))
    Z == ZP1 - ZN
    alphacvx(1:N) >= 0
cvx_end
```

You seem to be using equality constraints where an assignment would be appropriate. So no, I do not think you are using CVX correctly. However, I also don’t think your problem is convex, so I would refer you to the FAQ.

Why is it not convex? In previous posts you confirmed its convexity.

I said only that the specific eigenvalue piece is convex. I also said:

> I have no idea if any of your other constraints satisfy the DCP rules, however. You’ll have to determine that for yourself.

My code above assumes Z is a linear function of the variables. That assumption does not seem to be satisfied here. Again, consult the FAQ.
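To make the non-convexity concrete: when Q is linear in alphacvx, Z = Q*K0*Q is quadratic in alphacvx, and the composed objective need not be convex in alphacvx. A tiny hypothetical instance of my own construction (K0 = [0 -1; -1 0], Q = diag(alpha)) exhibits a violation even with alpha >= 0:

```python
import numpy as np

def neg_part_sum(Z):
    """Sum of absolute values of the negative eigenvalues of symmetric Z."""
    w = np.linalg.eigvalsh(Z)
    return np.abs(w[w < 0]).sum()

# K0 is symmetric and not positive definite (eigenvalues +1 and -1).
K0 = np.array([[0.0, -1.0], [-1.0, 0.0]])

def f(alpha):
    Q = np.diag(alpha)  # Q is linear in alpha, so Z = Q @ K0 @ Q is quadratic in alpha
    return neg_part_sum(Q @ K0 @ Q)

a = np.array([2.0, 0.0])
b = np.array([0.0, 2.0])
mid = (a + b) / 2
print(f(a), f(b), f(mid))  # 0.0 0.0 1.0 -- the midpoint value exceeds the average
```

Here f(a) = f(b) = 0 but f((a + b)/2) = 1, so f is not convex along the segment between two feasible points.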