I am solving a convex program that is exactly the same as the one presented at this link. Namely, let A be a symmetric positive semidefinite matrix. I want to minimize norm(H*vec(A) - Y), where vec(A) is the vectorized version of A and Y is a column vector.
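A minimal sketch of this formulation in CVX (the dimensions `m`, `n` and the data `H`, `Y` are illustrative placeholders, not my actual problem data):

```matlab
n = 5; m = 30;                 % illustrative dimensions
H = randn(m, n^2);             % stand-in for the given H
Y = randn(m, 1);               % stand-in for the given Y

cvx_begin
    variable A(n,n) symmetric semidefinite
    minimize( norm(H * vec(A) - Y) )
cvx_end
```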
I am experiencing two problems:

In my situation, I know the true value for A (and it satisfies the symmetric positive semidefinite constraint). When I plug this A into the objective function, I get a lower objective value than what is returned by CVX.

The optimal value reported by CVX, namely cvx_optval, is not exactly the same as what I get by computing norm(H*A(:)-Y) in MATLAB. cvx_optval is 2.5175e-10, while MATLAB gives me 1.3138e-10. Any insights on the issue?
The difference between 2.5175e-10 and 1.3138e-10 is just solver tolerance “noise”; perhaps the exact optimal value is zero.
The solvers called by CVX only solve to within a solver tolerance, even when they report that an optimal solution was found.
And on some problems, the argmin is not unique, even though the optimal objective value is unique to within solver tolerance.
If that doesn’t address your concern, then please provide more details, including exact output.
Hi Mark,
Thanks for the quick reply. I agree that a convex program does not imply a unique solution.
The thing is, when I compute norm(H*A(:)-Y) in MATLAB, where A is the true matrix, the objective value is 1e-15, which is much, much lower than the objective value from CVX. Can this difference in objective value magnitude still be attributed to solver tolerance noise?
1e-15 is not much smaller than 2e-10 in this context. Both are essentially zero, so it might just be a roundoff and transformation thing.
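One way to see this is to compare both residuals to the scale of the problem data rather than to each other. A sketch (here `A_true` stands for the known true matrix, and 1e-8 is a typical default solver tolerance, both illustrative):

```matlab
scale    = max(1, norm(Y));                 % problem scale
rel_cvx  = cvx_optval / scale;              % ~2.5e-10 / scale
rel_true = norm(H*A_true(:) - Y) / scale;   % ~1e-15 / scale

% Both relative residuals fall below a typical solver tolerance,
% so the two solutions are numerically indistinguishable as optima:
fprintf('%g %g\n', rel_cvx, rel_true);
```

If both values are below the solver tolerance, the solver has no reason to prefer one point over the other.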