Controlling subset quadratic term

I have no idea what you're asking. I'll let Johan handle it over at !topic/yalmip/F5_hbwHEH0Y .

Thank you. I just needed a formal way to tell the optimizer to penalize the risk associated with the \bar{x}, \bar{\Sigma} terms more heavily than the full x' \Sigma x term; I wasn't sure how to do this.

You can do something like
x'*Sigma*x + lambda*xbar'*Sigmabar*xbar
for some specified value of lambda > 1, for instance 2, 5, or 10. You can pick a value of lambda out of the air, or use some type of cross-validation to choose it.
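As a rough sketch of what that objective computes (in Python with NumPy rather than MATLAB/YALMIP; the covariance data, the subset indices, and the function name are illustrative assumptions, not from the thread), the penalized risk could be evaluated as:

```python
import numpy as np

def penalized_risk(x, Sigma, subset, Sigmabar, lam):
    """Full-portfolio risk x' * Sigma * x plus an extra penalty
    lam * xbar' * Sigmabar * xbar on a chosen subset of assets."""
    xbar = x[subset]                         # weights of the penalized subset
    full = x @ Sigma @ x                     # x' * Sigma * x
    extra = lam * (xbar @ Sigmabar @ xbar)   # lambda * xbar' * Sigmabar * xbar
    return full + extra

# Toy example: 3 assets, penalize the risk of the first two twice as hard.
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
subset = [0, 1]
Sigmabar = Sigma[np.ix_(subset, subset)]     # subset covariance block
x = np.array([0.5, 0.3, 0.2])
print(penalized_risk(x, Sigma, subset, Sigmabar, lam=2.0))
```

With lambda = 1 this reduces to ordinary risk plus a double-counting of the subset block; larger lambda pushes the optimizer to shrink the subset positions first. Inside a YALMIP model the same expression would be written directly on the decision variable.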

But whether any of this captures the "risk" you are talking about is a modeling question, which is your responsibility.