# Writing 1/(x*log(1+1/x))

Hi guys,
I ran into a problem while using CVX. My constraint involves f(x) = 1/(x*log(1+1/x)) for x > 0. The function is convex, but I don't know how to express it in CVX. Can you help me?
Thanks!

Is your constraint

f(x) <= constant

or

f(x) <= something

?

For x > 0

`x*log(1+1/x)` can be rewritten as `-rel_entr(x,x+1)`, which is a concave expression.

`1/(x*log(1+1/x))` can be rewritten as `inv_pos(-rel_entr(x,x+1))`, which is a convex expression.
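To see why the rewriting works, one can check the identity directly from the definition of `rel_entr(x,y) = x*log(x/y)`:

$$-\operatorname{rel\_entr}(x,\, x+1) \;=\; -x\log\frac{x}{x+1} \;=\; x\log\frac{x+1}{x} \;=\; x\log\!\left(1+\frac{1}{x}\right).$$

Since `rel_entr` is jointly convex and its arguments here are affine in x, the negated expression is concave, and `inv_pos` of a positive concave expression is convex under the DCP ruleset.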


My constraint is sum_i a_i*f(x_i) <= C, where the constants a_i > 0 and C > 0, i.e., a positive weighted sum of the f(x_i) is at most a positive number. For each i, f(x_i) = 1/(x_i * log(1+b_i/x_i)), where b_i is a positive constant and the optimization variables x_i are positive real numbers.
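For reference, the constraint described above, written out in full, is:

$$\sum_i a_i\, f(x_i) \;=\; \sum_i \frac{a_i}{x_i \log\!\left(1 + b_i/x_i\right)} \;\le\; C, \qquad a_i > 0,\; b_i > 0,\; C > 0,\; x_i > 0.$$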

Thank you very much, now it works.

Make `b` an n by 1 vector.

```
variable x(n)
sum(inv_pos(-rel_entr(x, x+b))) <= C
```

```
>> help rel_entr

 rel_entr   Scalar relative entropy.
    rel_entr(X,Y) returns an array of the same size as X+Y with the
    relative entropy function applied to each element:
                     { X.*LOG(X./Y)  if X >  0 & Y >  0,
    rel_entr(X,Y) =  { 0             if X == 0 & Y >= 0,
                     { +Inf          otherwise.
    X and Y must either be the same size, or one must be a scalar. If X and
    Y are vectors, then SUM(rel_entr(X,Y)) returns their relative entropy.
    If they are PDFs (that is, if X>=0, Y>=0, SUM(X)==1, SUM(Y)==1) then
    this is equal to their Kullback-Leibler divergence SUM(KL_DIV(X,Y)).
    -SUM(rel_entr(X,1)) returns the entropy of X.
```

Now I see there is an `a_i`. So make `a` also an n by 1 vector.

```
variable x(n)
a'*inv_pos(-rel_entr(x, x+b)) <= C
```
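Putting the pieces together, a minimal complete CVX program might look like the following sketch. The data `a`, `b`, `C` and the objective `sum(x)` are placeholders for illustration only; substitute your actual problem data and objective:

```
% Hypothetical data, for illustration only
n = 5;
a = rand(n,1) + 0.1;   % positive weights a_i
b = rand(n,1) + 0.1;   % positive constants b_i
C = 10;                % positive right-hand side

cvx_begin
    variable x(n)
    minimize( sum(x) )                       % placeholder objective
    subject to
        % sum_i a_i / (x_i * log(1 + b_i/x_i)) <= C
        a'*inv_pos(-rel_entr(x, x+b)) <= C
cvx_end
```

The implicit domain of `inv_pos` and `rel_entr` keeps x strictly positive, so no explicit x > 0 constraint is needed.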