# How can I express this optimization problem in CVX?

The optimization problem is given below.

Here the s_t are the optimization variables, and h_t and w_t are vectors of compatible sizes.
Assume the parameters \alpha, \beta, \phi, N, h, and w are known.

How can I express this in CVX?

\text{maximize}\quad\exp\left(\sum\nolimits_{t=1}^{T}s_{t}\alpha_{t}\right)

subject to

\frac{1}{N}\left(\frac{\exp(s_{t}\alpha_{t})-1}{\phi_{t}}+\phi_{t}\beta_{t}^{2}\right)\leq h_{t}^{\mathrm{H}}w_{t},\qquad t=1,\dots,T

Presuming phi > 0 (elementwise), which is necessary for the constraints to be convex, you can let CVX invoke its successive approximation method (with appropriate warnings and caveats) for the exp in the constraints, and write the CVX code virtually identically to your formulation above. Because exp is monotonically increasing, maximizing exp(alpha'*s) is equivalent to maximizing the affine expression alpha'*s, so the objective needs no approximation at all.

```matlab
% h and w are matrices whose columns h(:,t) and w(:,t) have
% dimensions compatible with the code below
cvx_begin
    variable s(T)
    % exp is increasing, so maximizing exp(alpha'*s) is equivalent
    % to maximizing the affine expression alpha'*s
    maximize(alpha'*s)
    subject to
        for t = 1:T
            (1/N)*((exp(alpha(t)*s(t)) - 1)/phi(t) + phi(t)*beta(t)^2) <= h(:,t)'*w(:,t)
        end
cvx_end
```


Any vectorization is left as an exercise for the OP.
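For example, a sketch of the vectorized constraint, assuming alpha, phi, and beta are T-by-1 column vectors, h and w are matrices whose t-th columns are h_t and w_t, and the products h_t^H w_t are real (the real() call merely strips numerical imaginary residue):

```matlab
% one elementwise inequality replacing the for-loop over t = 1:T
(1/N)*((exp(alpha.*s) - 1)./phi + phi.*beta.^2) <= real(sum(conj(h).*w, 1)).'
```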

With the transformation

s_{t}:=\log(r_{t})

we can equivalently express the optimization problem as

\text{maximize}\quad\left(\prod\nolimits_{t=1}^{T}r_{t}^{\alpha_{t}}\right)^{1/\sum\nolimits_{t=1}^{T}\alpha_{t}}

subject to

\frac{1}{N}\left(\frac{r_{t}^{\alpha_{t}}-1}{\phi_{t}}+\phi_{t}\beta_{t}^{2}\right)\leq h_{t}^{\mathrm{H}}w_{t},\qquad t=1,\dots,T
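The equivalence can be verified by substituting s_{t}=\log(r_{t}) into the original problem:

\exp\left(\sum\nolimits_{t=1}^{T}s_{t}\alpha_{t}\right)=\prod\nolimits_{t=1}^{T}e^{s_{t}\alpha_{t}}=\prod\nolimits_{t=1}^{T}r_{t}^{\alpha_{t}}

and \exp(s_{t}\alpha_{t})=r_{t}^{\alpha_{t}} in each constraint. Raising the objective to the power 1/\sum\nolimits_{t=1}^{T}\alpha_{t}, which is positive whenever \sum\nolimits_{t=1}^{T}\alpha_{t}>0, is a monotonically increasing transformation and therefore does not change the set of maximizers.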

This is a weighted geometric mean maximization: the objective is the weighted geometric mean of the r_t, which CVX provides as the concave function geo_mean. Provided each \alpha_t \ge 1, so that r_t^{\alpha_t} is convex, CVX internally transforms the problem into an SOCP.
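A sketch of the transformed problem in CVX, under the assumption that each alpha(t) >= 1 (so that r(t)^alpha(t) is convex) and that N, alpha, beta, phi, h, and w are defined as before:

```matlab
cvx_begin
    variable r(T)
    % geo_mean(r, 1, alpha) is the weighted geometric mean
    % (prod_t r(t)^alpha(t))^(1/sum(alpha)), concave in r
    maximize(geo_mean(r, 1, alpha))
    subject to
        for t = 1:T
            % pow_p(r(t), alpha(t)) represents r(t)^alpha(t),
            % convex for alpha(t) >= 1 and r(t) >= 0
            (1/N)*((pow_p(r(t), alpha(t)) - 1)/phi(t) + phi(t)*beta(t)^2) <= h(:,t)'*w(:,t)
        end
cvx_end
```

No successive approximation is needed here, since the exp has been eliminated.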

Then, which of these two methods has lower computational complexity?