Hi, how do I write this in CVX form? $\min_{\mathbf{R}} \sum_{i=1}^{L-1} \sum_{j=i+1}^{L} \left| \mathbf{a}^H(\theta_i)\,\mathbf{R}\,\mathbf{a}(\theta_j) \right|^2$, where $\mathbf{R}$ is a positive semidefinite optimization matrix and $\mathbf{a}(\theta_j)$ is a constant vector. Thank you for anyone's help.
square_abs(a(:,i)'*R*a(:,j))
(square_abs rather than square_pos here, because the cross term a(θ_i)'*R*a(θ_j) is complex in general; it is affine in R since the a vectors are constants, so its squared magnitude is convex. For a real quadratic form with the same vector on both sides, square_pos or quad_form would also work.)
The rest is a matter of correct indexing and summation, which I will leave to you. Presumably there is some dependency on i, otherwise why would there be a sum over i?
You can build up the Objective in a for loop, such as
Objective = 0;
for i = 1:L-1
    for j = i+1:L
        Objective = Objective + ...
    end
end
minimize(Objective)
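Putting the pieces together, here is a minimal CVX sketch (illustrative, not your exact code; it assumes the constant steering vectors are stored as the columns of an N-by-L complex matrix A, so A(:,j) = a(θ_j) — those names are my assumptions):

```matlab
% Minimal CVX sketch (assumption: A is an N-by-L complex matrix whose
% j-th column is the constant steering vector a(theta_j)).
cvx_begin
    variable R(N,N) hermitian semidefinite   % PSD optimization matrix
    Objective = 0;
    for i = 1:L-1
        for j = i+1:L
            % A(:,i)' * R * A(:,j) is affine in R (the a vectors are
            % constants), so its squared magnitude is a convex atom.
            Objective = Objective + square_abs(A(:,i)' * R * A(:,j));
        end
    end
    minimize(Objective)
cvx_end
```

Note the `hermitian semidefinite` keyword on the variable declaration, which encodes the PSD constraint on R directly rather than adding it as a separate constraint.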
Thank you, Mark. It works!
I just looked at this again, and now see that it involves
a'(\theta_i)*R*a(\theta_j)
In particular, a different vector (conjugate-transposed) premultiplies R than the vector postmultiplying it. In simpler notation, that would be like x'*R*y, where x and y are both variables. If so, that would be non-convex. (If x and y are constants, x'*R*y is affine in the variable R, and its squared magnitude is convex.) But it is your problem, and I don't know what the notation means, so I leave the determination to you.