I have an optimization problem

min f(x)
s.t. ||x|| < 1

where f is a convex function (a sum of logarithms of sigmoid functions) and the norm is the L2 norm.
I wonder what methods are available for this type of problem. I have tried SLSQP and COBYLA, but they seem heavyweight: they solve a much more general problem with arbitrary functions in the inequality and equality constraints, while in my case the only constraint is the norm.
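For reference, here is a minimal sketch of what I am currently doing with SciPy's SLSQP. The objective is a toy stand-in for my actual f (a random matrix `A` and a sum of log-sigmoid terms), not my real data:

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in objective: sum_i log(1 + exp(-a_i . x)), which is convex.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))

def f(x):
    # -log(sigmoid(z)) = log(1 + exp(-z)), summed over rows of A
    return np.sum(np.logaddexp(0.0, -(A @ x)))

def grad_f(x):
    # d/dx sum_i log(1 + exp(-a_i . x)) = -A^T sigmoid(-A x)
    z = A @ x
    return -(A.T @ (1.0 / (1.0 + np.exp(z))))

x0 = np.zeros(5)
# SLSQP only sees a generic inequality g(x) >= 0; here g(x) = 1 - ||x||^2
res = minimize(
    f, x0, jac=grad_f, method="SLSQP",
    constraints=[{"type": "ineq",
                  "fun": lambda x: 1.0 - x @ x,
                  "jac": lambda x: -2.0 * x}],
)
print(res.x, np.linalg.norm(res.x))
```

This works, but the solver treats the ball constraint as just another black-box inequality, which is what feels wasteful.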
Are there methods that are more lightweight (in particular, ones I could integrate with gradient descent) while still preserving the constraint?
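To make concrete what I mean by "integrate with gradient descent": I was imagining something like renormalizing the iterate back onto the unit ball whenever a gradient step leaves it. A self-contained sketch, again with a toy log-sigmoid objective of my own construction:

```python
import numpy as np

# Same toy convex objective as a stand-in for my real f
rng = np.random.default_rng(1)
A = rng.normal(size=(20, 5))

def grad_f(x):
    # gradient of sum_i log(1 + exp(-a_i . x))
    z = A @ x
    return -(A.T @ (1.0 / (1.0 + np.exp(z))))

x = np.zeros(5)
lr = 0.1
for _ in range(200):
    x = x - lr * grad_f(x)       # plain gradient step
    n = np.linalg.norm(x)
    if n > 1.0:                  # project back onto the unit L2 ball
        x = x / n
print(np.linalg.norm(x))         # <= 1 by construction
```

Is this kind of step-then-project scheme a legitimate method here, or is there a better-founded lightweight approach?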