Prescribing a tolerance for positive definiteness


(Lucas) #1

I have specified a problem whose objective function is a norm of an error, to be minimized subject to LMI constraints:

minimize norm(error)
subject to LMI == semidefinite

CVX does solve the problem with an excellent fit to the data, but there seem to be some very small negative eigenvalues (on the order of -1e-7) which violate the constraints.

Is there a way to require CVX not to allow negative eigenvalues? So to speak, could I raise the tolerance to some small but positive eigenvalue, say +1e-7?
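
For concreteness, here is a stripped-down sketch of the kind of model I mean (the data D and the size n are placeholders, not my actual problem):

n = 5;
D = randn(n);  D = (D + D')/2;        % placeholder symmetric data to fit
cvx_begin
    variable X(n, n) symmetric
    minimize( norm(X - D, 'fro') )    % norm of the error
    X == semidefinite(n)              % the LMI constraint
cvx_end
min(eig(X))                           % this is where I see values around -1e-7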


(Mark L. Stone) #2

If X is an n-by-n matrix to be constrained, impose either
lambda_min(X) >= 1e-7
or
X - 1e-7*eye(n) == semidefinite(n)
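
For example, in a stripped-down model this is all it takes (n is a placeholder size; your objective and any other constraints go where indicated):

n = 5;                                    % placeholder dimension
cvx_begin
    variable X(n, n) symmetric
    % ... your objective and other constraints here ...
    lambda_min(X) >= 1e-7                 % option 1: eigenvalue floor (lambda_min is concave, so this is a valid constraint)
    % X - 1e-7*eye(n) == semidefinite(n)  % option 2: equivalent shifted LMI
cvx_end

Both forms say the same thing: every eigenvalue of X must be at least 1e-7, not merely nonnegative.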


(Erling D.Andersen) #3

You can also perturb the optimal X so it becomes positive semidefinite, i.e., add a small multiple of the identity to it.
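
Something along these lines (a sketch; X and n are the solved variable and its dimension from your model):

Xs = (X + X')/2;                 % symmetrize against roundoff
shift = max(0, -min(eig(Xs)));   % smallest shift that clears the negative eigenvalues
X_psd = Xs + shift*eye(n);       % perturbed solution, positive semidefinite up to roundoff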


(Henry Wolkowicz) #4

Did you use cvx_precision best? That would be the first thing to try. Also, showing us the LMI might help. If it is badly conditioned, in the sense of not having any strictly feasible points, then you will have difficulty finding a good solution with guaranteed nonnegativity.
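
That is, something along these lines (the objective and LMI below are placeholders for yours):

cvx_begin
    cvx_precision best               % solve to the tightest precision the solver can reach
    variable X(n, n) symmetric
    minimize( norm(X - D, 'fro') )   % placeholder objective
    X == semidefinite(n)             % placeholder LMI
cvx_end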


(Henry Wolkowicz) #5

Also, you can try SDPT3 rather than SeDuMi. SeDuMi is the default for CVX, I think? But SeDuMi allows small negative eigenvalues for the dual slacks in the primal-dual embedding.
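
The switch is one line inside the model (the rest here is a placeholder):

cvx_begin
    cvx_solver sdpt3                 % use SDPT3 for this model instead of the default
    variable X(n, n) symmetric
    minimize( norm(X - D, 'fro') )   % placeholder objective
    X == semidefinite(n)             % placeholder LMI
cvx_end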