My optimization problem is something like this:
\min_X \quad \lambda\,\mathrm{TV}(X) + \sum_{k=1}^{N} \|P_k\|_{*} \\ \text{s.t.} \quad X_{i,j} = A_{i,j}, \quad (i,j) \in \Omega
Here, A_{400\times 400} is a known matrix, X is the variable of the same size as A, and the P_k\ (k=1,2,\cdots,N) are all the 5\times 5 patches of X.
This is obviously a convex optimization problem, so the first thing that came to my mind was CVX (great tool!).
I’ve tried SeDuMi, SDPT3 and SCS. SeDuMi seems to fail, SDPT3 is far too slow to converge, and SCS is the best choice but still too slow:
After CVX converts this problem into SCS format, the console prints:
Calling SCS 1.2.2: 3809909 variables, 1836489 equality constraints
The total solve time is 2h 20min. I even timed each stage:
1h for objective-function generation and problem conversion,
20min for the iterations to converge,
1h for converting the SCS results back to the original problem’s variables.
And this is after I removed 60% of the patches; with only 37% of the patches removed, it takes 5h 30min to solve.
What should I do? I haven’t tried MOSEK, Gurobi, GLPK or ECOS. Should I try them?
Besides, I found an old post on this forum, Nuclear norm minimization with tfocs, where the OP asked whether TFOCS can solve this kind of problem. Can it?
Again, my main question is: what is currently the fastest way to solve such a problem?
Any suggestions would be appreciated.