I would like to minimize, over a vector w in R^n, a function F(w) := f(w) + g(w), where f(w) is convex (I can already write it in CVX) and g(w) is either (i) the 2-k symmetric gauge function (Bhatia, Matrix Analysis, 2013) or (ii) the k-support norm (Argyriou, Foygel, Srebro, "Sparse prediction with the k-support norm", 2012). I am having difficulty writing these two norms in a way that abides by the CVX (DCP) rules. Their definitions are the following:
2-k symmetric gauge norm of a vector w in R^n, given a constant k such that 0 < k <= n:
Let u be the vector containing the coordinate-wise absolute values of w, with elements sorted in descending order. The 2-k norm of w is the square root of the sum of the k largest squared components of u:
$\|w\|_{(k),2} = \Big(\sum_{i=1}^{k} u_i^2\Big)^{1/2}$.
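For reference, the definition can be checked numerically with a plain NumPy sketch (this only evaluates the norm at a point; it is not a DCP-compliant CVX formulation, and the function name is mine):

```python
import numpy as np

def two_k_norm(w, k):
    """2-k symmetric gauge norm: sqrt of the sum of the k largest squared |w_i|."""
    u = np.sort(np.abs(np.asarray(w, dtype=float)))[::-1]  # |w| sorted descending
    return float(np.sqrt(np.sum(u[:k] ** 2)))
```

Note that k = 1 recovers the infinity norm and k = n recovers the Euclidean norm.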
k-support norm of a vector w in R^n, given a constant k such that 0 < k <= n (the arrow pointing downwards after a vector means the same vector with elements sorted in descending order). In the closed form given in the Argyriou et al. paper,
$\|w\|_k^{sp} = \Big(\sum_{i=1}^{k-r-1} \big(|w|^\downarrow_i\big)^2 + \frac{1}{r+1}\Big(\sum_{i=k-r}^{n} |w|^\downarrow_i\Big)^2\Big)^{1/2}$,
where $r \in \{0, \dots, k-1\}$ is the unique integer satisfying $|w|^\downarrow_{k-r-1} > \frac{1}{r+1}\sum_{i=k-r}^{n} |w|^\downarrow_i \ge |w|^\downarrow_{k-r}$ (with the convention $|w|^\downarrow_0 = +\infty$).
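Again just as a numeric reference (assuming the closed form from the Argyriou, Foygel, Srebro paper; this evaluates the norm at a point and is not a CVX formulation), a sketch:

```python
import numpy as np

def k_support_norm(w, k):
    """k-support norm via its closed form: a unique r in {0,...,k-1} splits
    |w| (sorted descending) into a quadratic head and an averaged tail."""
    z = np.sort(np.abs(np.asarray(w, dtype=float)))[::-1]  # |w| descending
    for r in range(k):
        tail = z[k - r - 1:].sum()                          # sum over i = k-r .. n
        head = np.inf if k - r - 2 < 0 else z[k - r - 2]    # convention: z_0 = +inf
        if head > tail / (r + 1) >= z[k - r - 1]:           # condition selecting r
            return float(np.sqrt(np.sum(z[:k - r - 1] ** 2) + tail ** 2 / (r + 1)))
    raise ValueError("no valid r found")
```

As a sanity check, k = 1 should recover the l1 norm and k = n the l2 norm.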
Any ideas? Thanks.