How to handle an objective function of the form x*log(C.*Y/x)

x and Y are variables, where x is a scalar and Y is a vector; C is a constant vector. The optimization problem is to maximize sum(x*log(C.*Y/x)). How can I express this objective function in CVX? I have proved that the objective is concave and tried -rel_entr(x, x + Y.*C), but it failed.

x*log(C*Y/x) = -rel_entr(x,C*Y)

The formulation you tried, -rel_entr(x, x + Y.*C), is for x*log(1 + C*Y/x), not x*log(C*Y/x).

Oh, thank you very much. I left out the +1 in the problem description. Now I understand. But in my problem, C is a constant vector and Y is a variable vector, so can I express sum(x*log(1 + C.*Y/x)) as sum(-rel_entr(x, x + C.*Y)) in CVX? Both C and Y are N×1 column vectors.
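As a sanity check outside CVX, the identity behind this reformulation can be spot-checked numerically: with rel_entr(a,b) = a*log(a/b), we have -rel_entr(x, x + C.*Y) = x*log((x + C.*Y)/x) = x*log(1 + C.*Y/x), elementwise, with the scalar x broadcast against the vectors. A minimal NumPy sketch (the sample values for x, C, Y are arbitrary, not from the thread):

```python
import numpy as np

def rel_entr(a, b):
    """Elementwise relative entropy a*log(a/b), per CVX's rel_entr
    for the a > 0, b > 0 branch."""
    a, b = np.broadcast_arrays(np.asarray(a, float), np.asarray(b, float))
    return a * np.log(a / b)

# Scalar variable x, constant vector C, variable vector Y (sample positive values).
x = 1.7
C = np.array([0.5, 2.0, 3.0])
Y = np.array([1.0, 0.25, 4.0])

lhs = np.sum(x * np.log(1.0 + C * Y / x))   # objective as originally written
rhs = np.sum(-rel_entr(x, x + C * Y))       # CVX-compatible rel_entr form
assert np.allclose(lhs, rhs)
```

Since rel_entr accepts a scalar first argument against a vector second argument, the CVX objective would be written with the scalar x broadcast the same way, summed over the vector result.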

help rel_entr

rel_entr Scalar relative entropy.
rel_entr(X,Y) returns an array of the same size as X+Y with the
relative entropy function applied to each element:
                { X.*LOG(X./Y)  if X > 0 & Y > 0,
rel_entr(X,Y) = { 0             if X == 0 & Y >= 0,
                { +Inf          otherwise.
X and Y must either be the same size, or one must be a scalar. If X and
Y are vectors, then SUM(rel_entr(X,Y)) returns their relative entropy.
If they are PDFs (that is, if X>=0, Y>=0, SUM(X)==1, SUM(Y)==1) then
this is equal to their Kullback-Leibler divergence SUM(KL_DIV(X,Y)).
-SUM(rel_entr(X,1)) returns the entropy of X.

Disciplined convex programming information:
    rel_entr(X,Y) is convex in both X and Y, nonmonotonic in X, and
    nonincreasing in Y. Thus when used in CVX expressions, X must be
    real and affine and Y must be concave. The use of rel_entr(X,Y) in
    an objective or constraint will effectively constrain both X and Y 
    to be nonnegative, hence there is no need to add additional
    constraints X >= 0 or Y >= 0 to enforce this.
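The help text's claims about KL divergence and entropy can also be spot-checked numerically. A short NumPy sketch (the sample distributions are arbitrary, not from the thread):

```python
import numpy as np

def rel_entr(x, y):
    # Elementwise x*log(x/y) for x > 0, and 0 where x == 0,
    # matching the first two branches of the help text's definition.
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    return np.where(x > 0, x * np.log(np.where(x > 0, x, 1.0) / y), 0.0)

# For PDFs P and Q, SUM(rel_entr(P,Q)) is their Kullback-Leibler divergence.
P = np.array([0.2, 0.5, 0.3])
Q = np.array([0.25, 0.25, 0.5])
kl = np.sum(rel_entr(P, Q))
assert kl >= 0.0  # KL divergence is nonnegative

# -SUM(rel_entr(X,1)) recovers the entropy of X.
H = -np.sum(rel_entr(P, np.ones_like(P)))
assert np.isclose(H, -np.sum(P * np.log(P)))
```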