Disciplined convex programming error

            minimize    lambda_max(X) / lambda_min(X)
            subject to  X - A'*X*A < 0
                        X > 0

How do I formulate this in CVX?

It always gives me a disciplined convex programming error, because the objective is a convex term divided by a convex term.

I also tried

                    minimize    t
                    subject to  lambda_max(X) - t*lambda_min(X) <= 0
                                X > 0
                                X - A'*X*A < 0

It still gives the same disciplined convex programming error.

That’s because this problem is not convex. The rules in the user guide are non-negotiable.

However, it is quasiconvex. For fixed t>0, the constraint lambda_max(X) <= t*lambda_min(X) is convex (a convex function bounded above by a concave one), so you can solve this feasibility problem:

cvx_begin sdp
    variable X(n,n) symmetric
    lambda_max(X) <= t * lambda_min(X)   % condition number of X is at most t
    X <= A' * X * A                      % nonstrict version of X - A'*X*A < 0
    X >= 0                               % in sdp mode, this means X is positive semidefinite
cvx_end

Note that t is not a variable; that cannot be changed. You can, however, determine the optimal t using bisection.

Keep in mind, however, that this problem is homogeneous, so X=0 is a solution. Strict inequalities don’t work in CVX. You’ll have to do something about this; see Strict inequalities in the users’ guide. For this particular problem, my recommendation is to add this constraint:

    trace(X) == 1
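
For completeness, here is a rough sketch of how the bisection could be wired together with that normalization. The bracketing interval, tolerance, and the check on cvx_status are my own choices for illustration, and A and n are assumed to already be defined:

    % Bisection on t (sketch only; adjust the interval and tolerance to your problem)
    lo = 1; hi = 1e6;            % the condition number is always >= 1
    tol = 1e-3;
    X_best = [];
    while hi - lo > tol
        t = (lo + hi) / 2;
        cvx_begin sdp quiet
            variable X(n,n) symmetric
            lambda_max(X) <= t * lambda_min(X)
            X <= A' * X * A
            X >= 0
            trace(X) == 1        % rules out the trivial solution X = 0
        cvx_end
        if strcmp(cvx_status, 'Solved')
            hi = t;              % feasible: the optimal t is at most this value
            X_best = X;
        else
            lo = t;              % infeasible: the optimal t must be larger
        end
    end

If the loop ever finds a feasible t, the final value of hi is an upper bound (within tol) on the optimal condition number, and X_best holds a matrix achieving it.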