Using CVX to solve a convex maximization problem iteratively but yielding decreasing answers

It appears you are trying to solve a (non-convex?) problem via an iterative series of (convex) CVX problem solutions, perhaps along the lines of stephen_boyd’s answer in the thread “How to handle nonlinear equality constraints?”. Start by reading that, if you haven’t already. Of course the disclaimers there apply. But you have 120 of these turned-around inequalities? Have you considered the possibility that, given your initial value of x_ini, your algorithm, even if nominally correct in some sense, is diverging, and if so, have you tried other initial values? If your actual problem is non-convex, the best you can hope for from an iterative approach such as this is a local optimum, and the local optimum found, if any, may depend on the initial value. What is the actual problem you’re trying to solve via this iterative approach?

Does your actual problem have the constraint
x*E(:,:,k)*x' == d for k = 1:120?
That is of course non-convex. If that is what you are trying to deal with by iterating on (convex) inequalities, then why do you have a multiplier of 2 on your >= constraint (and are not dividing by the appropriate norm at each iteration)? And is it even feasible, given that the requirement must be satisfied simultaneously for 120 values of k? So I withhold judgment until you tell us what you’re actually trying to solve, but it looks nasty.
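If that is indeed the constraint, one standard way to handle it iteratively (the convex-concave idea) is to keep the convex side x*E(:,:,k)*x' <= d exactly and replace the non-convex side x*E(:,:,k)*x' >= d by its linearization about the previous iterate. I am not claiming this is what you are doing or should do; the sketch below is only meant to make the discussion concrete, and everything specific in it (the dimension n, the linear objective c*x', the synthetic PSD slices E(:,:,k), the value of d) is an assumption for illustration, not your actual problem.

```
% Sketch only: synthetic data standing in for your actual E, d, x_ini, and objective.
n = 5;  K = 120;  d = 1;  maxiter = 20;
c = randn(1,n);                          % placeholder linear objective coefficients
E = zeros(n,n,K);
for k = 1:K
    A = randn(n);  E(:,:,k) = A'*A;      % symmetric PSD slices (assumed)
end
x_prev = randn(1,n);                     % your x_ini would go here

for iter = 1:maxiter
    cvx_begin                            % no "quiet" while debugging
        variable x(1,n)
        maximize( c*x' )                 % placeholder objective
        subject to
            for k = 1:K
                Ek = E(:,:,k);
                % convex side of x*Ek*x' == d, kept exactly:
                quad_form(x', Ek) <= d;
                % non-convex side x*Ek*x' >= d, linearized about x_prev:
                x_prev*Ek*x_prev' + 2*x_prev*Ek*(x - x_prev)' >= d;
            end
    cvx_end
    fprintf('iter %d: %s, objective %g\n', iter, cvx_status, cvx_optval);
    if ~strcmp(cvx_status, 'Solved'), break; end   % stop if the subproblem was not solved
    x_prev = x;
end
```

Whether the 120 linearized constraints are even simultaneously feasible at your starting point is exactly the feasibility concern raised above.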

CVX only addresses convex problems, so barring numerical (solution) difficulties, the solver should find the global optimum; indeed, convex problems have no non-global local optima. So I presume the actual reason for your iterative approach is that your actual problem is non-convex. Also, given that you are getting results you don’t understand, you may want to avoid the quiet option until you do understand what’s going on.
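For instance, with the same placeholder subproblem as in the sketch above, the only change is omitting quiet and looking at what CVX reports after each solve:

```
% Diagnostic sketch: keep the solver log visible and check CVX's report.
cvx_begin                       % not "cvx_begin quiet"
    variable x(1,n)
    maximize( c*x' )            % placeholder objective
    subject to
        % ... your constraints go here ...
cvx_end
disp(cvx_status)                % e.g. Solved, Inaccurate/Solved, Infeasible, Unbounded, Failed
disp(cvx_optval)                % reported optimal value (+/-Inf if unbounded/infeasible)
```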