Symmetric positive semidefinite constraint in CVX

I am considering the following convex optimization problem:

Let y be a column vector of dimension N and H a matrix of dimension N x M, where M = m^2 for some m. I want to find an m x m matrix A that minimizes ||y - H vec(A)||, where vec(A) is the column-stacked (vectorized) form of A. I am constraining A to be a symmetric positive semidefinite matrix.

I was able to solve it using CVX, following the code here. However, the solution I get is not unique (I happen to know the true solution).
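For reference, my formulation is along the lines of the following sketch (m is the side length of A, so M = m^2):

    cvx_begin
        variable A(m,m) semidefinite    % A is symmetric positive semidefinite
        minimize( norm(y - H*vec(A)) )  % least-squares fit to y
    cvx_end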

Every row of H is the vectorized version of a rank-one matrix z*z', where z is a column vector. Because each such matrix is symmetric, the columns of H multiplying A(i,j) and A(j,i) are identical, so H is not full column rank. But if I rewrite the optimization problem so that, instead of vec(A), I only use the entries of A in the upper triangle, the resulting H can be made full rank and I can then obtain a unique solution.
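Concretely, one way to build the reduced matrix (call it Hr) is to keep one column per upper-triangular entry of A, adding together the two identical columns of H that multiply A(i,j) and A(j,i). A sketch of the idea:

    m     = sqrt(size(H,2));             % A is m-by-m, so H has m^2 columns
    [I,J] = find(triu(true(m)));         % index pairs (i,j) with i <= j
    Hr    = zeros(size(H,1), numel(I));
    for k = 1:numel(I)
        i = I(k);  j = J(k);
        Hr(:,k) = H(:, (j-1)*m + i);                 % column of H multiplying A(i,j)
        if i < j
            Hr(:,k) = Hr(:,k) + H(:, (i-1)*m + j);   % identical column multiplying A(j,i)
        end
    end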

How can I formulate this in CVX?

Not sure I understand your question. You can use triu(A) and vec(triu(A)) in CVX. It really comes down to what optimization problem you are trying to solve. Whatever you do, it needs to be conformable, i.e., have compatible dimensions.

Hi Mark,

Nice suggestion – I didn’t know that. I’ll try vec(triu(A)) and see if it makes a difference!

Note that triu(A) and vec(triu(A)) will have zeros in what was the lower triangle of A, so you’ll have to deal with that. If you figure out what calculation to do with A, presuming A is a MATLAB variable, then you should be able to do the same thing with A being a CVX variable.
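For example, you could pull out just the upper-triangular entries by linear indexing, which works the same way whether A is a numeric matrix or a CVX variable (a sketch, with A being m-by-m):

    idx  = find(triu(true(m)));   % linear indices of the upper triangle, column order
    a_up = A(idx);                % m*(m+1)/2 entries; the lower triangle is skipped entirely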

Hi Mark, yes, I agree. I found a way to extract the non-zero elements from a lower/upper triangular matrix. Surprisingly, CVX gives the same answer whether or not I exploit this simplification. Thanks for the help, though!
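For completeness, the reduced formulation looks roughly like this in CVX (a sketch, with Hr and idx as in the earlier snippets; the PSD constraint is still imposed on the full matrix A):

    idx = find(triu(true(m)));          % upper-triangular entries of A, column-major order
    cvx_begin
        variable A(m,m) semidefinite    % full matrix variable, still constrained to be PSD
        minimize( norm(y - Hr*A(idx)) ) % fit one copy of each symmetric entry
    cvx_end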