I would like to minimize, over a vector w in R^n, a function F(w) := f(w) + g(w), where f(w) is convex (I can write it in CVX) and g(w) is either (i) the 2-k symmetric gauge norm (Bhatia, Matrix Analysis) or (ii) the k-support norm (Argyriou, Foygel, Srebro, "Sparse prediction with the k-support norm", 2012). I am having difficulty writing these two norms in a way that abides by the CVX ruleset. Their definitions are the following:

2-k symmetric gauge norm of a vector w in R^n, given an integer k such that 0 < k <= n:
Let u be the vector of coordinatewise absolute values of w.
The 2-k norm of w is then the square root of the sum of the squares of the k largest components of u.
Since I can upload just one image, I will put the formula in a comment below.
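In symbols, this is my own LaTeX rendering of the definition above, with $u^{\downarrow}$ denoting $u$ sorted in descending order:

```latex
\|w\|_{(k)} \;=\; \left( \sum_{i=1}^{k} \bigl( u_i^{\downarrow} \bigr)^2 \right)^{1/2},
\qquad u_i = |w_i| .
```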
k-support norm of a vector w in R^n, given an integer k such that 0 < k <= n:
The downward arrow after a vector denotes that vector with its entries sorted in descending order.
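For reference, here is the closed form from the Argyriou–Foygel–Srebro paper as I remember it (please double-check against the paper; $|w|^{\downarrow}$ is the vector of absolute values of w sorted in descending order, with the convention $|w|^{\downarrow}_0 = \infty$):

```latex
\|w\|_k^{sp}
  \;=\; \left( \sum_{i=1}^{k-r-1} \bigl( |w|_i^{\downarrow} \bigr)^2
    \;+\; \frac{1}{r+1} \Bigl( \sum_{i=k-r}^{n} |w|_i^{\downarrow} \Bigr)^{2} \right)^{1/2},
% where r \in \{0, 1, \dots, k-1\} is the unique integer satisfying
|w|_{k-r-1}^{\downarrow} \;>\; \frac{1}{r+1} \sum_{i=k-r}^{n} |w|_i^{\downarrow} \;\ge\; |w|_{k-r}^{\downarrow} .
```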
Any ideas? Thanks
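In case it helps: here is a small pure-Python sketch I put together (the function names are my own) that evaluates both norms directly from the definitions above, so any candidate CVX formulation can be sanity-checked numerically against it.

```python
import math

def two_k_norm(w, k):
    """2-k symmetric gauge norm: square root of the sum of the
    k largest squared absolute entries of w."""
    u = sorted((abs(x) for x in w), reverse=True)
    return math.sqrt(sum(x * x for x in u[:k]))

def k_support_norm(w, k):
    """k-support norm via the closed form of Argyriou et al. (2012):
    find the unique r in {0, ..., k-1} with (1-based indices, u_0 = +inf)
        u_{k-r-1} > (1/(r+1)) * sum_{i=k-r}^{n} u_i >= u_{k-r},
    where u is |w| sorted in descending order."""
    u = sorted((abs(x) for x in w), reverse=True)
    for r in range(k):
        tail = sum(u[k - r - 1:])                            # sum_{i=k-r}^{n} u_i
        left = u[k - r - 2] if k - r - 1 >= 1 else math.inf  # u_{k-r-1}
        if left > tail / (r + 1) >= u[k - r - 1]:            # ... >= u_{k-r}
            head = sum(x * x for x in u[:k - r - 1])         # sum of k-r-1 largest squares
            return math.sqrt(head + tail * tail / (r + 1))
    raise RuntimeError("no valid r found; check the inputs")
```

A quick consistency check: for k = 1 the 2-k norm reduces to the infinity norm and the k-support norm to the l1 norm, while for k = n both reduce to the l2 norm.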