As part of a larger optimization, I would like to enforce an orthogonality constraint on two vector variables. The two vectors are the rows of X, so that G = X*X' is diagonal exactly when the rows are orthogonal. What I’ve done so far is to formulate the constraint as an SDP. This seems to work “most of the time”, but not always. In particular, in the code below, orthogonality is only enforced when sum_a_lower_bound is less than about 16.5, AND only with the objective minimize trace(G). My questions are:
- Under what conditions is the equality G = X*X' effectively enforced, thereby enforcing orthogonality?
- Why is “minimize trace(G)” needed? Is there any way to drop it?
- Is there a simpler way to enforce orthogonality of two vector variables in a convex framework, preferably via a constraint that is independent of the objective? One other approach I can think of amounts to: ||a+v|| <= t AND ||a-v|| <= t, then minimize t. This works, but becomes problematic when other quantities must be minimized simultaneously in the objective.
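For context, the identity behind that second approach is ||a+v||^2 - ||a-v||^2 = 4*(a . v), so the two norms coincide exactly when a and v are orthogonal. A quick NumPy check of the identity (illustrative only, with made-up vectors; not part of the CVX model):

```python
import numpy as np

a = np.array([1.0, 2.0, -1.0])
v = np.array([2.0, 0.0, 2.0])  # a . v = 2 + 0 - 2 = 0, so a is orthogonal to v

# ||a+v||^2 - ||a-v||^2 = 4*(a . v): equal norms <=> orthogonality
equal_norms = np.isclose(np.linalg.norm(a + v), np.linalg.norm(a - v))
orthogonal = np.isclose(a @ v, 0.0)
print(equal_norms, orthogonal)  # True True
```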
Here’s my working code snippet. Thanks in advance!
n = 2;
sum_a_lower_bound = 16.5; % orthogonality seems enforced only when <= ~16.5 in this example
cvx_begin sdp
    variable G(n, n) diagonal; % Gram matrix
    variable X(n, 3);
    expressions v(1, 3) a(1, 3);
    v = X(1, :);
    a = X(2, :);
    minimize trace(G) % trace is the nuclear norm of PSD G; minimizing it pulls G down toward X*X'
    subject to
        [G, X; X', eye(3)] >= 0; % Schur complement: relaxes G == X*X' to G >= X*X'
        v == [8, -2, 3]; % placeholder for more complex constraints on v
        sum_a_lower_bound <= sum(a); % placeholder for more complex constraints on a
cvx_end
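For completeness, the Schur-complement fact used in the LMI above ([G, X; X', I] >= 0 iff G - X*X' >= 0, since I is positive definite) can be sanity-checked numerically. A small NumPy sketch, separate from the CVX model, using random matrices of the same shapes:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 2, 3
X = rng.standard_normal((n, m))

def is_psd(M, tol=1e-9):
    """Check positive semidefiniteness via the eigenvalues of a symmetric matrix."""
    return bool(np.all(np.linalg.eigvalsh(M) >= -tol))

# Case 1: G - X*X' is PSD (add a random PSD slack) -> block matrix is PSD.
S = rng.standard_normal((n, n))
S = S @ S.T  # PSD (positive definite with probability 1)
G_ok = X @ X.T + S
block_ok = np.block([[G_ok, X], [X.T, np.eye(m)]])

# Case 2: G - X*X' has negative eigenvalues -> block matrix is not PSD.
G_bad = X @ X.T - S - 1e-3 * np.eye(n)
block_bad = np.block([[G_bad, X], [X.T, np.eye(m)]])

print(is_psd(block_ok), is_psd(block_bad))  # True False
```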