Hello, I want to minimize the function below, which is a product of two convex increasing functions and so should itself be convex. The problem is that I cannot convert it to DCP form: the DCP error arises when I multiply the two convex functions. Any ideas would be appreciated!
The problem is to minimize v_approx as a function of w, where:

log_term1 = -0.2 * w + 0.3 * w * w
log_term2 = 0.3 * w * w
term1 = exp(log_term1)
term2 = exp(log_term2) - 1
v_approx = term1 * term2
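For context on why DCP rejects the product outright: the product of two convex functions is not convex in general (nonnegativity and monotonicity of both factors matter), so a rule-based checker cannot accept convex*convex. A minimal numeric counterexample, using hypothetical factors chosen only for illustration:

```python
import numpy as np

# x**2 and (x - 2)**2 are both convex and nonnegative, but one is
# decreasing where the other is increasing, so their product has a
# bump at x = 1 between zeros at x = 0 and x = 2: not convex.
g = lambda x: x**2 * (x - 2.0)**2
print(g(0.0), g(1.0), g(2.0))  # 0.0 1.0 0.0 -> midpoint above endpoints
```

Since g(1) exceeds the average of g(0) and g(2), g violates the defining inequality of convexity, which is why DCP needs the extra log-convexity (or sign/monotonicity) structure before it will accept a product.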
Were it not for the -1 in term2, this would be allowed in CVX due to CVX’s log-convexity rules, as documented at Log of sigmoid function - #3 by mcg. If the -1 were instead any nonnegative constant, it would still be allowed. But the -1 (or any negative value) destroys the log-convexity of term2. In that case, term2 is still convex, but not log-convex, and CVX does not allow log-convex times convex, because such a product is not necessarily convex.
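A quick numerical sanity check of that rule, assuming the reconstruction term1 = exp(log_term1) and term2 = exp(log_term2) + 1 (i.e., the -1 replaced by a nonnegative constant): both factors are then log-convex, so log(term1*term2) = log_term1 + log(exp(log_term2) + 1) is a sum of convex functions, and its discrete second difference is nonnegative everywhere on a grid.

```python
import numpy as np

# With +1 in place of -1, the log of the product is convex:
# log_term1 is a convex quadratic, and log(exp(u) + 1) is a convex
# nondecreasing function of the convex u = 0.3*w**2.
w = np.linspace(-3.0, 3.0, 601)
log_f = (-0.2 * w + 0.3 * w**2) + np.log(np.exp(0.3 * w**2) + 1.0)

# Discrete second difference of a convex function is nonnegative.
d2 = np.diff(log_f, 2)
print(d2.min() >= 0)  # True
```

This is only a grid check, not a proof, but it matches CVX's rule: log-convex times log-convex is log-convex, hence acceptable.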
It does appear that term1*term2 is convex, but not for the reasons you state (Edit: see my correction below).
As for CVXPY, which is off-topic for this forum, I am not familiar in detail with its log-convexity rules, but it does have such rules.
Understood. But I am interested in the particular function, which has the -1 in term2. How can I approach minimizing term1*term2 within CVX (not CVXPY), or in any other convex optimization framework?
term1*term2 is actually neither log-convex nor log-concave (the 2nd derivative of log(term1*term2) changes sign at about w = 1.43). Also, it appears term1*term2 may actually be concave. In that case, maximizing its negative would not be a convex optimization problem.