I apologize in advance if I’m doing something extremely silly here, but I can’t figure out what is causing some weird behaviour on one of my CVX problems. Here is a minimal CVX example that displays the weirdness that I’m encountering:
cvx_begin sdp quiet
variable X(2,2) hermitian
minimize norm_nuc(X)
trace(X) == 1;
X >= 0;
X(1,2) == 0.2*1i;
cvx_end
The above code should return a value of 1, since the nuclear norm of every positive semidefinite matrix is just equal to its trace, which is constrained to be 1 here. However, when I run this code (on MATLAB R2013a, using CVX 2.1), I get a value of 1.4000 in cvx_optval. Nonetheless, subsequently running
However, subsequently computing the nuclear norm of the X returned by the above CVX problem gives a value of 1.000, as expected. Where is the 1.4000 coming from?
The problem seems to be with the 0.2*1i constraint. If I change that to a real number, then this problem goes away. Also, I get the same weirdness whether I use SDPT3 or SeDuMi as the solver.
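For reference, the expected value can be checked outside CVX. The sketch below (plain numpy, not the CVX model above) builds one feasible Hermitian PSD matrix with trace 1 and a 0.2i off-diagonal entry, and confirms that its nuclear norm (sum of singular values) equals its trace, since the singular values of a PSD matrix are its nonnegative eigenvalues:

```python
import numpy as np

# A feasible point for the problem above: Hermitian, trace 1, X[0,1] = 0.2i.
X = np.array([[0.5, 0.2j],
              [-0.2j, 0.5]])

# Verify it is PSD (eigenvalues here are 0.3 and 0.7).
eigs = np.linalg.eigvalsh(X)
assert np.all(eigs >= -1e-12)

# Nuclear norm = sum of singular values = sum of eigenvalues = trace = 1.
nuc = np.linalg.svd(X, compute_uv=False).sum()
print(nuc)  # 1.0
```

So every feasible X gives objective value 1, and the 1.4000 cannot be a legitimate optimum.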
Sorry to bump this topic, but is there any update, or something obvious that I’m missing? Over 2 years later, this is still causing me major problems. Minimizing pretty much anything involving norm_nuc or lambda_sum_largest applied to complex Hermitian matrices gives incorrect results, regardless of which solver I use. But putting the same model into YALMIP produces correct results.
Thanks Mark. I’m running CVX 2.1 build 1110, but under a much older version of MATLAB (r2008b). I’ll assume for now that the old version of MATLAB is the culprit, and try to get my hands on a newer one.
I cannot promise that the older version of MATLAB is the culprit, but what I can tell you is that I can’t afford to verify correct operation on MATLAB versions that old. I try to support versions going 5 years back, but even that is difficult.