Hello, I don’t know where I am wrong in the code. Here stdev = (x’ * Sigma * x)^2, but because I don’t know how to calculate Sigma, I use maximize(avg*x - Delta * stdev)

I think you forgot to multiply stdev by x. That would make the objective function a scalar.

The resulting model seems mathematically “immature” (i.e., lousy) from a mathematical finance perspective, but I’ll leave it to you to determine what model your school assignment is supposed to use (just don’t expect me as a portfolio optimization client).

That is the simplest fix (though still “wrong”) for your incorrect way of computing the portfolio standard deviation, which is perhaps supposed to be “Risk”?

You perhaps really want Delta*norm(stdev'.*x), using your stdev array, not the formula you show for stdev (which should have a sqrt, not ^2). But it is for you to learn the proper theory and calculation, which are separate from optimization knowledge. You ought to learn what Sigma (the covariance matrix) is, then see how it produces a result equivalent to what I showed, given the assumed independence of the asset returns.
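To see the equivalence I mean: if asset returns are independent, Sigma is a diagonal matrix with the squared per-asset standard deviations on its diagonal, and sqrt(x' * Sigma * x) collapses to norm(stdev .* x). A minimal numerical sketch (in Python/NumPy rather than MATLAB, with made-up stdev and weight values purely for illustration):

```python
import numpy as np

# Hypothetical data: per-asset return standard deviations and portfolio weights
stdev = np.array([0.10, 0.20, 0.15])
x = np.array([0.5, 0.3, 0.2])

# Under independence, the covariance matrix is diagonal: Sigma = diag(stdev.^2)
Sigma = np.diag(stdev ** 2)

# Portfolio standard deviation, computed two equivalent ways
sd_quadratic = np.sqrt(x @ Sigma @ x)   # sqrt(x' * Sigma * x)
sd_norm = np.linalg.norm(stdev * x)     # norm(stdev .* x) in MATLAB notation

print(np.isclose(sd_quadratic, sd_norm))  # the two agree
```

The norm form is what a conic solver wants anyway: it is recognized as convex directly, without you having to form Sigma.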

Edit: I inserted a missing '. in the argument of norm to make that argument a vector.