The following simple CVX procedure implements a Dantzig selector for the linear regression problem min || X*beta - y ||. Since the objective is a 1-norm and the constraint bounds an infinity norm, the problem can be transformed into an LP. I was surprised to see that CVX actually transformed it into an SOCP rather than an LP. Why does CVX choose an SOCP formulation over an LP formulation?

```
%
% First, get the problem size.
%
[n, p] = size(X);
%
% Now, solve the Dantzig selector with CVX.
%
cvx_begin
    cvx_precision high
    variable betadantzig(p)
    minimize( norm(betadantzig, 1) )
    subject to
        norm(X' * (X * betadantzig - y), Inf) <= delta;
cvx_end
optval = cvx_optval;
```
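For reference, here is a minimal sketch of the explicit LP reformulation I had in mind, written in Python with `scipy.optimize.linprog` (this is only illustrative of the standard 1-norm/infinity-norm transformation; it is not how CVX internally compiles the problem). It introduces an auxiliary vector `t` with `-t <= beta <= t`, minimizes `sum(t)`, and expands the infinity-norm constraint into two-sided linear inequalities:

```python
# Illustrative LP reformulation of the Dantzig selector:
#   min ||beta||_1  s.t.  ||X'(X beta - y)||_inf <= delta
# Variables z = [beta; t]; minimize sum(t) with -t <= beta <= t.
import numpy as np
from scipy.optimize import linprog

def dantzig_lp(X, y, delta):
    n, p = X.shape
    G = X.T @ X            # p x p Gram matrix
    h = X.T @ y            # p-vector
    # Objective: 0'*beta + 1'*t
    c = np.concatenate([np.zeros(p), np.ones(p)])
    I = np.eye(p)
    Z = np.zeros((p, p))
    # Inequalities A_ub @ z <= b_ub:
    #   beta - t <= 0,  -beta - t <= 0   (|beta| <= t)
    #   G beta <= delta + h, -G beta <= delta - h   (inf-norm constraint)
    A_ub = np.block([[ I, -I],
                     [-I, -I],
                     [ G,  Z],
                     [-G,  Z]])
    b_ub = np.concatenate([np.zeros(2 * p),
                           delta + h,
                           delta - h])
    bounds = [(None, None)] * p + [(0, None)] * p  # beta free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p], res.fun
```

The function name and the choice of solver backend (`method="highs"`) are my own; any LP solver would do. The optimal value returned equals `||beta||_1` at the solution.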