I’m given 2016 elements, and I build a graph on them with edges determined by pairwise distance: if the distance between two elements is exactly 2, I put an edge between them. I then try to estimate the size of the maximum independent set of this graph using the Lovász theta number. However, the program runs for about 2 hours and then returns the following:

Unexpected Standard exception from MEX file.
What() is:bad allocation
…
I see that this is due to a memory issue, but how can I modify the program so that it is able to solve this graph?

The following is my code:

close all; clear; clc
space = xlsread('D:\Code\space.xlsx');

% Pairwise L1 distances between the 2016 points.
distmat = zeros(2016);
for i = 1:2016
    for j = 1:2016
        distmat(i,j) = sum(abs(space(i,:) - space(j,:)));
    end
end

cvx_begin sdp
    variable B(2016,2016) symmetric
    maximize( sum(sum(B)) )
    subject to
        B >= 0;                % in sdp mode this means B is PSD
        trace(B) == 1;
        for i = 1:2016
            for j = 1:2016
                if distmat(i,j) == 2
                    B(i,j) == 0;   % zero entry on every edge
                end
            end
        end
cvx_end
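One way to cut CVX's modeling overhead (though it does not shrink the SDP the solver ultimately sees) is to state all the edge constraints at once instead of looping over 2016² scalar tests. A sketch, assuming `distmat` has been computed as above and that your CVX version supports linear indexing of matrix variables:

```matlab
% Sketch: collect the upper-triangular positions where the distance is
% exactly 2 (the graph's edges), then impose them in one vectorized
% equality. By symmetry of B, constraining the upper triangle suffices.
idx = find(triu(distmat == 2, 1));

cvx_begin sdp
    variable B(2016,2016) symmetric
    maximize( sum(sum(B)) )
    subject to
        B >= 0;          % PSD in sdp mode
        trace(B) == 1;
        B(idx) == 0;     % all edge entries constrained at once
cvx_end
```

This mainly speeds up CVX's model-building phase; the per-iteration memory inside the solver is governed by the same constraint count either way.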

Interesting. Your SDPT3 log seems to suggest that you have around 44000 edges, so pretty sparse for a graph on 2016 vertices, is that correct?

I generated a random graph with 2016 vertices and 44000 edges, and MOSEK went through the first few iterations using about 5% of the 512 GB of RAM on the machine where I'm running this, so around 25 GB. So it is possible you are pretty close to fitting. Maybe if you turn multithreading off it will fit; you could try that.
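If you are calling MOSEK through CVX, capping it at one thread might look like the sketch below. This assumes CVX 2.x with `cvx_solver_settings` available; `MSK_IPAR_NUM_THREADS` is MOSEK's thread-count parameter:

```matlab
% Sketch: force a single-threaded MOSEK solve, trading speed for a
% smaller peak working set during the interior-point iterations.
cvx_begin sdp
    cvx_solver mosek
    cvx_solver_settings('MSK_IPAR_NUM_THREADS', 1)
    variable B(2016,2016) symmetric
    % ... same objective and constraints as in the original model ...
cvx_end
```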

The complexity and memory grow very quickly as you increase the number of edges, since each edge translates directly into a linear constraint on the semidefinite variable. I don't think computing the theta number of a general graph on 2016 vertices is within reach at all.
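A back-of-the-envelope check of that memory figure: with m ≈ 44000 equality constraints, the dense Schur complement system an interior-point solver factors each iteration is an m-by-m matrix of doubles. This ignores sparsity exploitation and all other solver storage, so it is only an order-of-magnitude estimate:

```matlab
% Rough memory estimate for the dense Schur complement alone.
m = 44000;                   % approximate number of edge constraints
schur_gib = 8 * m^2 / 2^30;  % m-by-m double-precision matrix, in GiB
% This comes out in the low tens of GiB, the same order as the
% ~25 GB usage observed in the experiment above.
```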