Quadratic Equality Constraints

I’ve found that CVX accepts my quadratic equality constraints and returns an answer it reports as feasible and optimal.

The following paper on Disciplined Convex-Concave Programming (DCCP) suggested a possible explanation:

I assume that CVX is accepting the quadratic equality constraint based on the DCCP rule set (rather than the DCP rule set) and solving the problem using the convex-concave procedure?

If that is not true, please let me know. If it is true, can anyone point me toward a resource that would help me establish when quadratic equalities are guaranteed to be solvable using Sequential Convex Programming? I’ve been using the following lecture notes so far: https://web.stanford.edu/class/ee364b/lectures/seq_slides.pdf

CVX does not implement DCCP. But there is a DCCP package for CVXPY, as described in the paper you linked.

Please show a complete example (reproducible, with all input data, if possible). Please include the output of cvx_version.

Is there a zero coefficient or something similar that is perhaps getting simplified at an early stage of processing, so that CVX does not see it as a quadratic equality constraint?

Yes, that was exactly what was happening. Thank you for clarifying that CVX does not implement DCCP; that is what helped me find the bug.

If I want to solve an optimization problem with a quadratic equality constraint, is it correct that I will need to use the DCCP package in CVXPY, and that there is no corresponding package for CVX in MATLAB?

Assuming the quadratic equality constraints satisfy the DCCP ruleset, will CVXPY automatically solve the problem using Sequential Convex Programming?
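
For concreteness, here is roughly what I have in mind - a minimal sketch, assuming the dccp add-on from https://github.com/cvxgrp/dccp is installed alongside CVXPY (the data and the unit-norm constraint are placeholders of mine, not my actual problem):

```python
# Minimal sketch of what I am hoping for (toy data, not my actual problem).
# Assumes the DCCP add-on for CVXPY is installed:  pip install dccp
import cvxpy as cp
import dccp

x = cp.Variable(3)
a = [1.0, 2.0, 3.0]   # placeholder data

# ||x||^2 == 1 is a quadratic equality: convex on the left, constant on the right.
prob = cp.Problem(cp.Minimize(cp.sum_squares(x - a)),
                  [cp.sum_squares(x) == 1])

print("is DCP: ", prob.is_dcp())         # False: DCP requires both sides of an equality to be affine
print("is DCCP:", dccp.is_dccp(prob))    # should be True: both sides have known curvature

# method='dccp' runs the (heuristic) convex-concave procedure on the problem
prob.solve(method='dccp')
print(x.value)
```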

I don’t know of any implementation of DCCP in CVX, or any plans for one to be developed. Maybe you can take it upon yourself to add this - I’m not saying it would be easy, or even possible, but the CVXQUAD https://github.com/hfawzi/cvxquad developers developed quite the add-on for CVX, so maybe you could too. I recommend you ask the DCCP authors about any plans for CVX.

As for how the DCCP package for CVXPY works, I will defer to the developers, and the CVX forum is not the place to find them hanging out. (Stephen Boyd hasn’t logged in to the CVX forum since his Sep 2012 post, linked below - he’s not a big fan of MATLAB these days.)

As for CVX, I can point you to Stephen Boyd’s Sep 2012 CVX Forum post How to handle nonlinear equality constraints? for how to write a loop to handle equality constraints - specifically, how to handle the inequality constraint going the wrong way, which, added to the one going the convex way, constitutes the equality constraint. Of course, your quadratic must be either convex or concave, rather than indefinite, for this to work.
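
To make that looping idea concrete, here is a rough sketch, written in CVXPY syntax purely for illustration (the same structure works as a MATLAB loop around cvx_begin/cvx_end). The unit-sphere constraint and the data are placeholders of my own, and I’ve added a nonnegative slack with an increasing penalty to keep each subproblem feasible, which is roughly what penalty CCP does. No convergence or optimality guarantee is implied.

```python
# Hand-rolled penalty-CCP loop for the toy problem
#   minimize ||x - a||^2  subject to  ||x||^2 == 1
# The equality is split into ||x||^2 <= 1 (convex, kept exactly) and the
# wrong-way inequality ||x||^2 >= 1, whose convex left side is replaced by
# its affine minorant at the current iterate xk.
import cvxpy as cp
import numpy as np

a = np.array([1.0, 2.0, 3.0])                    # placeholder data
xk = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)    # starting point (on the sphere)
lam = 1.0                                        # penalty weight on the slack

for it in range(25):
    x = cp.Variable(3)
    s = cp.Variable(nonneg=True)                 # slack keeps the subproblem feasible
    constraints = [
        cp.sum_squares(x) <= 1,                  # convex direction, kept exactly
        # linearization of ||x||^2 at xk:  ||xk||^2 + 2*xk'(x - xk) >= 1 - s
        2 * xk @ x - xk @ xk >= 1 - s,
    ]
    prob = cp.Problem(cp.Minimize(cp.sum_squares(x - a) + lam * s), constraints)
    prob.solve()
    if np.linalg.norm(x.value - xk) < 1e-6:
        break
    xk = x.value
    lam = min(10.0 * lam, 1e4)                   # push the slack toward zero over iterations

print(xk, np.linalg.norm(xk))                    # a (heuristic) point near the sphere
```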

Anyhow, I fail to see the fascination with the convex-concave procedure or Sequential Convex Programming (SCP). Sequential Quadratic Programming (SQP) in conjunction with a BFGS quasi-Newton update of the Hessian of the Lagrangian is a special case of Sequential Convex Programming. Any SQP implementation which is not safeguarded, for instance by a trust region or line search, is considered unreliable, even when applied to convex problems. The “Boyd”-style SCP write-ups I’ve seen, and what presumably gets implemented by the typical person not experienced with non-convex nonlinear optimization, are simplistic, totally unsafeguarded implementations. If you use a high-quality non-convex nonlinear optimizer, for instance one based on SQP, it may or may not be an SCP, but in any event it will implement safeguarding with care, and should be more reliable than a crude home-brew SCP. You may also have the option, at least for small enough problems, of using a (sort of) rigorous global optimizer, such as BARON, which accepts quadratic constraints even when the quadratics are indefinite.

SCP may not converge to anything, and if it does, it is not necessarily even a local minimum of the original problem.

Thank you Mark, that’s very helpful.

Particularly the thoughts regarding the tradeoff between SCP and other non-convex optimizers. My hope was that the convex-concave procedure (CCP) would be well suited to problems in which the nonconvexities are restricted to the concave inequalities that come from the quadratic equalities, and that perhaps some guarantees regarding convergence or global optimality could be made.

That hope came from discovering the DCCP conditions and drawing the (incorrect) conclusion that, similar to the DCP conditions, the DCCP conditions provide guarantees about the solution. Instead, it appears that the DCCP conditions only indicate that a problem is well suited to being (heuristically) solved with the CCP algorithm, because iteratively linearizing the nonconvex constraints creates subproblems that satisfy the DCP rules:

“DCCP problems are an ideal standard form for DC programming because the linearized problem in algorithm 1.1 is a DCP program whenever the original problem is DCCP. The linearized problem can thus be automatically converted into a cone program and solved using generic solvers.”
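
To spell out what “the linearized problem” means for the kind of constraint I care about, take $\|x\|_2^2 = 1$ as a toy example. CCP keeps the convex direction $\|x\|_2^2 \le 1$, which is already DCP, and in each iteration replaces the convex left side of the wrong-way direction $\|x\|_2^2 \ge 1$ by its affine minorant at the current iterate $x_k$:

$$\|x_k\|_2^2 + 2 x_k^\top (x - x_k) \ge 1,$$

which is affine in $x$, so every subproblem satisfies the DCP rules and can be handed to a generic conic solver, exactly as the quote says.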

I’ll keep looking into it and will update this thread if I come across a result regarding the global optimality of CCP for quadratic equality constraints for some class of problems, though I am not expecting one.

Thanks again