Total Least Squares with Regularization

Hello,
I am trying to implement total least squares regression with regularization in CVX. Can this be done similarly to the LASSO problem, but using total least squares in place of ordinary (linear) least squares?

Thanks for your help.
Sev

Does equation (4) in http://people.duke.edu/~hpgavin/SystemID/CourseNotes/TotalLeastSquares.pdf represent the unregularized formulation of your total least squares problem? If so, it is straightforward to enter that into CVX. It should also be straightforward to add a “reasonable” regularization term to the objective, where “reasonable” means some norm of an affine function of the optimization (decision) variables.
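For example, a regularization term of that kind would look something like the following LASSO-style sketch. (This is only an illustration of the pattern on an ordinary least-squares fit, not the TLS model; it assumes X is an n-by-p data matrix, Y an n-vector, and lambda a regularization weight you choose.)

cvx_begin
    variable a(p)  % p-vector of coefficients
    minimize( sum_square(X*a - Y) + lambda*norm(a, 1) )  % least squares plus l1 penalty
cvx_end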

Hi Mark,
yes, thanks. That document describes total least squares in its equation (4). Would the CVX implementation look like the following?

Assume a simple one-dimensional problem of the form

Yactual + tildeY = (Xactual + tildeX)*a,   with a being the scalar to be estimated.

We are given vectors X and Y of n = 1000 measurements, each contaminated with errors. Then:

cvx_begin
    variable tildeX(1000,1)   % errors in the X measurements
    variable tildeY(1000,1)   % errors in the Y measurements
    variable Xactual(1000,1)  % underlying error-free X values
    variable Yactual(1000,1)  % underlying error-free Y values
    variable a(1,1)           % scalar coefficient to estimate

    minimize( sum(sum_square([tildeX tildeY])) )  % total squared error in X and Y
    subject to
        Yactual + tildeY == (Xactual + tildeX)*a
        Yactual + tildeY == Y
        Xactual + tildeX == X
cvx_end

Sorry, I messed up. I neglected to consider that a needs to be an optimization (a.k.a. decision, a.k.a. CVX) variable, in addition to tildeX and tildeY. Therefore the constraint Yactual + tildeY == (Xactual + tildeX)*a is nonlinear (bilinear, since it multiplies the variables Xactual and tildeX by the variable a), non-convex, and not allowed by CVX. I don’t know whether there is an alternative formulation that would be a convex optimization problem; at least, I can’t think of one.
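As a side note, the unregularized TLS problem in your scalar case has the classical closed-form solution via the SVD (Golub & Van Loan), which avoids CVX entirely, although it does not accommodate a regularization term. A minimal sketch, assuming X and Y are your 1000-by-1 measurement vectors:

[~, ~, V] = svd([X Y], 'econ');  % SVD of the stacked data matrix
v = V(:, end);                   % right singular vector for the smallest singular value
a_tls = -v(1) / v(2);            % classical TLS estimate of the scalar a (assumes v(2) ~= 0)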