Convert to DCP a product of two increasing convex functions

Hello, I want to minimize the function below, which is a product of two increasing convex functions and should therefore be convex. The problem is that I cannot convert it to DCP form: the DCP error arises when I multiply the two convex functions. Any ideas would be appreciated!

The problem is to minimize v_approx as a function of w. In CVXPY syntax (with cp.square(w) in place of w * w, the DCP-friendly way to write the square):

import cvxpy as cp

w = cp.Variable()
log_term1 = -0.2 * w + 0.3 * cp.square(w)
log_term2 = 0.3 * cp.square(w)

term1 = cp.exp(log_term1)
term2 = -1 + cp.exp(log_term2)
v_approx = term1 * term2  # product of two convex expressions

prob = cp.Problem(cp.Maximize(-v_approx))  # fails DCP analysis at the product
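Although the DCP formulation fails, the objective itself is cheap to evaluate. A quick NumPy sketch (the grid range and spacing are my choices, not from the original post) locates the minimizer by brute force:

```python
import numpy as np

# Evaluate v_approx(w) = exp(-0.2*w + 0.3*w**2) * (exp(0.3*w**2) - 1)
# on a grid and find the minimizer by brute force.
w = np.linspace(-2.0, 2.0, 401)   # step 0.01; the grid includes w = 0
term1 = np.exp(-0.2 * w + 0.3 * w**2)
term2 = np.exp(0.3 * w**2) - 1.0
v = term1 * term2

i = np.argmin(v)
# term2 >= 0 with equality only at w = 0, while term1 > 0 everywhere,
# so v >= 0 and the minimum sits at w = 0 with value 0.
print(w[i], v[i])
```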

Were it not for the -1 in term2, this would be allowed in CVX, due to CVX’s log-convexity rules, as documented at Log of sigmoid function - #3 by mcg . If the -1 were instead any nonnegative constant, it would still be allowed. But the -1 (or any negative constant) destroys the log-convexity of term2. In that case, term2 is convex, but not log-convex, and CVX does not allow log-convex times convex, because such a product is not necessarily convex.
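The claim about term2 can be checked numerically. A small sketch (the three sample points are my choices) tests the midpoint inequalities: at these points term2 satisfies the convexity inequality f(m) <= (f(a)+f(b))/2 but violates the log-convexity inequality f(m) <= sqrt(f(a)*f(b)):

```python
import math

def term2(w):
    return math.exp(0.3 * w * w) - 1.0

a, b = 0.5, 1.5
m = 0.5 * (a + b)   # midpoint, w = 1.0

# Midpoint convexity holds at these points: f(m) <= (f(a) + f(b)) / 2
assert term2(m) <= 0.5 * (term2(a) + term2(b))

# Midpoint log-convexity fails: log f(m) > (log f(a) + log f(b)) / 2,
# equivalently f(m) > sqrt(f(a) * f(b)), so term2 is not log-convex.
assert term2(m) > math.sqrt(term2(a) * term2(b))
```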

It does appear that term1*term2 is convex, but not for the reasons you state (Edit: see my correction below).
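That observation can be supported numerically. A finite-difference sketch (grid and step size are my choices) estimates the second derivative of term1*term2 and finds it positive everywhere sampled:

```python
import numpy as np

def f(w):
    # term1 * term2 = exp(-0.2w + 0.6w^2) - exp(-0.2w + 0.3w^2)
    return np.exp(-0.2 * w + 0.3 * w**2) * (np.exp(0.3 * w**2) - 1.0)

w = np.linspace(-3.0, 3.0, 601)
h = 1e-4
# Central second difference approximates f''(w).
f2 = (f(w + h) - 2.0 * f(w) + f(w - h)) / h**2

print(f2.min())   # positive on this grid, consistent with convexity
```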

As for CVXPY, which is off-topic for this forum, I am not familiar with the details of its log-convexity rules, but it does have such rules.

Understood. But I am interested in the particular function which has the -1 in term2. How can I approach the optimization of term1*term2 within CVX (not CVXPY) or any other convex optimization formalism?

Maybe you can’t, unless some other forum reader figures out how (which might not be until Monday, if ever).

CVX can’t handle all convex optimization problems.

Let me correct an earlier misstatement of mine.

term1*term2 is actually neither log-convex nor log-concave: the second derivative of log(term1*term2) changes sign at about w = 1.43. However, term1*term2 itself is convex: it equals exp(-0.2w + 0.6w^2) - exp(-0.2w + 0.3w^2), and differentiating twice shows the second derivative is positive for all w. So minimizing v_approx (equivalently, maximizing its negative) is a convex optimization problem; it just cannot be expressed under CVX’s DCP and log-convexity rules.
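The sign change in the second derivative of log(term1*term2) can also be confirmed by finite differences. A sketch (grid restricted to w > 0, since term1*term2 vanishes at w = 0 and the log is singular there; grid and step size are my choices):

```python
import numpy as np

def logf(w):
    # log(term1 * term2) = -0.2w + 0.3w^2 + log(exp(0.3w^2) - 1)
    return -0.2 * w + 0.3 * w**2 + np.log(np.exp(0.3 * w**2) - 1.0)

w = np.linspace(0.5, 3.0, 251)   # step 0.01
h = 1e-4
# Central second difference approximates (log f)''(w).
g2 = (logf(w + h) - 2.0 * logf(w) + logf(w - h)) / h**2

# Negative at the low end, positive at the high end: log(term1*term2)
# is neither convex nor concave.
print(g2[0], g2[-1])

sign_change = w[np.argmax(g2 > 0)]   # first grid point where (log f)'' > 0
print(sign_change)                   # crossing near w = 1.43
```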