I am curious how to implement the matrix (AX) as a first step of the optimization objective function.

A is a four-dimensional numerical tensor, and X is the variable: the PSD relaxation of the Gram matrix x^T x. (AX)_{kl} is an element of the matrix (AX), with (AX)_{kl} = tr(A_{kl} X), where A_{kl} is a matrix slice of A.

In other words, (AX) is a matrix of traces in the form (A X)_{kl} = tr(A_{kl} X).

Figure out how to evaluate (i.e., write the MATLAB code for) the expressions of interest as if everything were double-precision MATLAB variables. Provided you don't rely on implicit expansion, which is not supported by CVX, the same code should also work when some of the MATLAB variables are replaced by CVX variables.

We'll leave you with the task of correctly deciphering whatever messy notation is in the tensor formulation, and of translating it to MATLAB code.

CVX will let you do things such as

A = i by j by k by l double precision array input
variable X(i,j,k,l)
sum(A(:).*X(:))
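To connect this back to the matrix of traces in the original question: if X is an n-by-n variable (rather than four-dimensional), each entry (AX)_{kl} = tr(A_{kl} X) can be built with an expression holder, using the identity trace(B*X) = sum(sum(B.' .* X)), which needs no implicit expansion. This is only a sketch under assumptions: the slice convention A(:,:,k,l) = A_{kl}, the dimensions, and the placeholder objective are all illustrative, not from the original post.

```matlab
n = 3; K = 4; L = 5;
A = randn(n, n, K, L);           % numeric input data (assumed layout)

cvx_begin
    variable X(n, n) semidefinite
    expression T(K, L)           % holds the matrix of traces (AX)
    for k = 1:K
        for l = 1:L
            B = A(:, :, k, l);               % the slice A_{kl}
            % trace(B*X) == sum(sum(B.' .* X)), CVX-compatible
            T(k, l) = sum(sum(B.' .* X));
        end
    end
    minimize( sum(T(:)) )        % placeholder objective built from T
cvx_end
```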

Thank you for your reply. Here is what I want to do:

Suppose I have a four-dimensional tensor A; the objective is \sum_{ijkl} A_{ijkl} X_{ij} Y_{kl}. This can be decomposed into a two-step sum: \sum_{ij} X_{ij} (\sum_{kl} A_{ijkl} Y_{kl}). That is, first sum over k,l, then sum over i,j. For the first step, if I fix the indices i,j, I will have

sum(sum(squeeze(A(i,j,:,:)) .* Y))

However, in the CVX language, I am not sure how to fix the indices i,j first:

I want to have

trace(X' * M), where M(i,j) = sum(sum(squeeze(A(i,j,:,:)) .* Y))

in other words,

sum(sum(X .* M))

It is not clear to me how to supply the indices i,j from the outer sum. Is it possible to express this two-step sum objective in CVX?
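One way to sketch this, assuming Y is numeric data (so the objective is affine in X): the inner sum over k,l involves no CVX variable, so it can be precomputed in plain MATLAB before cvx_begin. Because MATLAB's reshape follows column-major order, the whole inner step collapses to a single matrix-vector product. The dimensions and the cvx_begin skeleton below are illustrative assumptions, not taken from the original post.

```matlab
I = 3; J = 3; K = 4; L = 4;
A = randn(I, J, K, L);   % numeric tensor
Y = randn(K, L);         % numeric data (not a CVX variable here)

% Step 1 (pure MATLAB, outside CVX):
% M(i,j) = sum_{kl} A(i,j,k,l) * Y(k,l), via column-major reshape
M = reshape(reshape(A, I*J, K*L) * Y(:), I, J);

cvx_begin
    variable X(I, J)
    % Step 2: sum_{ij} X(i,j) * M(i,j); affine in X since M is constant
    minimize( sum(sum(X .* M)) )
    % ... constraints on X go here ...
cvx_end
```

Equivalently, the full quadruple sum can be written in one line as X(:)' * reshape(A, I*J, K*L) * Y(:); no per-index loop over i,j is needed.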