# Convex optimization

I want to solve the following minimization problem in CVX:

```
cvx_begin quiet
    cvx_solver mosek
    variable x(H*Max_nom, 1)

    minimize( sum( a linear part ) + ...
        sum( max(0, pow_p((x.*ax_k) - landa_max_k, 3) ./ ...
                    sqrt(pow_p((x.*ax_k) - landa_max_k, 2) + pow_p(rho, 2)) ) ) )
cvx_end
```


`ax_k` and `landa_max_k` are vectors of the same size as `x`, and `rho` is a scalar. I get errors from the `./` and `sqrt` operations. Furthermore, I can't use `norm(.)`, because I need to compute each entry separately and then take `sum(.)` over them. How can I solve this problem?

Kind regards,

(Mark L. Stone) #2

Have you proven that your objective function is convex? If so, how?

Yes. Please note that the penalty function being applied is `max(0, x^3/sqrt(x^2 + a^2))`.
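(As a sanity check, not a proof, one can sample the midpoint-convexity inequality numerically in plain MATLAB; the value `a = 0.5`, the sampling range, and the tolerance below are arbitrary illustrative choices:)

```
% Spot-check midpoint convexity of h(t) = max(0, t^3 / sqrt(t^2 + a^2)):
% h((s+t)/2) <= (h(s) + h(t))/2 must hold for all s, t if h is convex.
a = 0.5;                                   % illustrative value of the scalar a
h = @(t) max(0, t.^3 ./ sqrt(t.^2 + a^2));
s = 10*rand(1, 1e5) - 5;                   % random sample points in [-5, 5]
t = 10*rand(1, 1e5) - 5;
violations = sum(h((s + t)/2) > (h(s) + h(t))/2 + 1e-9)
```

A count of zero violations is consistent with convexity on the sampled range; an actual proof still requires an analytic argument such as checking the second derivative.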

(Mark L. Stone) #4

You don’t seem to have a constructive convexity proof employing the DCP convexity composition rules.

But my problem is composed of two terms: a linear part plus a convex part. That should make the overall problem convex. Am I wrong?

(Mark L. Stone) #6

You need to be able to construct the convex portion using CVX's DCP rules. CVX cannot handle all convex optimization problems.
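(To illustrate the composition rules: `sqrt` is concave in CVX, so its argument must be concave or affine; feeding it a convex expression is rejected even when the overall quantity happens to be convex. A minimal sketch, with an illustrative variable name:)

```
cvx_begin
    variable t
    % Rejected by the DCP ruleset: sqrt is concave, but its argument
    % square(t) + 1 is convex, so CVX reports an illegal operation --
    % even though sqrt(t^2 + 1) is in fact convex in t.
    minimize( sqrt( square(t) + 1 ) )
cvx_end
```

For this particular term there is a DCP-compliant equivalent, `norm([t; 1])`, which CVX does accept; the sticking point in the original objective is the division by such an expression, which no DCP rule covers.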

Thank you Prof. Stone. Do you have any idea to change the problem’s structure or use another optimization toolbox?

(Mark L. Stone) #8

I am not a Prof. In any event, I don't know how to structure your problem so that CVX will accept it. That doesn't mean no one else does (presuming it really is convex). Of course, changing your objective function to something else which is convex, but can be entered into CVX, is within your purview, because the readers of this forum don't understand what (real-world) problem you are trying to solve, nor is that within the scope of this forum.

My apology. Thank you so much Dr. Stone.
Best regards,

(Erling D.Andersen) #10

Do you have a nicely formatted version of your problem that you can show us? For instance, typed up in LaTeX.

Yes, I have.

\min_{x}\; P^{\top} x \;+\; \eta \sum_{k} \max\left\{ 0,\; \frac{\big((x \cdot ax_k) - \lambda^{\max}_{k}\big)^{3}}{\sqrt{\big((x \cdot ax_k) - \lambda^{\max}_{k}\big)^{2} + \rho^{2}}} \right\}

where P, x, ax_k, \lambda^{\max}_{k} \in \mathbb{R}^{M} are vectors and \eta, \rho are scalars.
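(For what it's worth, the convexity of the per-entry penalty can be checked by hand. Writing f(t) = t^3/\sqrt{t^2 + \rho^2} for the positive branch, differentiating twice gives:)

```latex
f'(t) = \frac{t^{2}\,(2t^{2} + 3\rho^{2})}{(t^{2} + \rho^{2})^{3/2}},
\qquad
f''(t) = \frac{t\,(2t^{4} + 5\rho^{2}t^{2} + 6\rho^{4})}{(t^{2} + \rho^{2})^{5/2}}
\;\ge\; 0 \quad \text{for } t \ge 0.
```

Since f(0) = f'(0) = 0, the branch joins the zero branch smoothly, so \max\{0, f(t)\} is convex on all of \mathbb{R}. Convexity alone, however, does not make the term DCP-representable, which is the obstacle discussed above.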