I have a matrix of the form
A = \begin{bmatrix}
D & v\\
v^H & \alpha
\end{bmatrix}
,
where D is a block-diagonal matrix with some of its blocks being zero matrices, v is a vector whose upper half is almost entirely zeros, and \alpha is a scalar. The matrix A has to be Hermitian positive semidefinite as a constraint of a convex optimization problem, so its eigenvalues have to be evaluated. The problem with this constraint takes a long time to solve with MATLAB CVX. As an example, the dimension of A starts at 1500x1500. I want to know how I can deal with this sparse matrix in the constraints and reduce the computational complexity.

This is the constraint:
A == hermitian_semidefinite(dim(1)+1)

I would greatly appreciate it if anyone can help me.

To achieve sparsity you have to introduce additional constraints of the form

X_ij = 0,

I suppose. If you have a lot of them, they will make the problem computationally costly to solve.
It might be possible to reformulate your problem to work around that issue.
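For illustration, here is a minimal sketch of what such element-fixing constraints look like in CVX. All names and dimensions (n, A, the particular zero positions, and the trace objective) are hypothetical, not taken from the original problem:

```matlab
% Hypothetical sketch: each known-zero entry of a PSD matrix variable
% becomes an extra equality constraint that the solver must handle.
n = 10;
cvx_begin sdp
    variable A(n,n) hermitian
    A >= 0;              % in sdp mode this means A is positive semidefinite
    A(1,3) == 0;         % one known-zero entry -> one equality constraint
    A(2:4,6:8) == 0;     % a whole zero block -> a batch of such constraints
    minimize( trace(A) ) % placeholder objective
cvx_end
```

With thousands of such equalities, the constraint count grows quickly, which is exactly the cost being discussed here.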

I also suggest you try different optimizers, such as Mosek, SeDuMi, and SDPT3.

Okay, let me clarify further. I have built the matrix D as a block-diagonal matrix consisting of two variables. So the solver knows that the zero elements are actually zero, right? Do they need to be defined as zeros again in the constraints? The zeros in the vector v are known as well: the corresponding elements are absolute zeros and do not contain any variable. Therefore, I think it is not necessary to constrain them.
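To make the setup concrete, a sketch of what I mean, assuming D consists of two m-by-m Hermitian variable blocks separated by a constant zero block (m and the block layout are illustrative, not the actual dimensions):

```matlab
% Sketch: the zero blocks of D are numeric constants, so no variables
% are created for them on the CVX side.
m = 3;
cvx_begin sdp
    variable X1(m,m) hermitian
    variable X2(m,m) hermitian
    Z = zeros(m);               % constant zero block, not a variable
    D = [X1 Z Z;
         Z  Z Z;
         Z  Z X2];              % block-diagonal D built by concatenation
    D >= 0;                     % Hermitian PSD constraint on D
    minimize( trace(X1) + trace(X2) )  % placeholder objective
cvx_end
```

The question is whether CVX still has to pass the zero pattern to the solver as explicit constraints, or whether the structure is exploited for free.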

I get much faster results with Mosek. A sample log is shown below.

Calling Mosek 9.1.9: 2253068 variables, 5 equality constraints
For improved efficiency, Mosek is solving the dual problem.

MOSEK warning 710: #2 (nearly) zero elements are specified in sparse col ‘’ (1) of matrix ‘A’.
MOSEK warning 710: #2 (nearly) zero elements are specified in sparse col ‘’ (2) of matrix ‘A’.
MOSEK warning 710: #1 (nearly) zero elements are specified in sparse col ‘’ (3) of matrix ‘A’.
MOSEK warning 710: #2 (nearly) zero elements are specified in sparse col ‘’ (4) of matrix ‘A’.
MOSEK warning 710: #1 (nearly) zero elements are specified in sparse col ‘’ (5) of matrix ‘A’.
MOSEK warning 710: #1 (nearly) zero elements are specified in sparse col ‘’ (6) of matrix ‘A’.
MOSEK warning 710: #1 (nearly) zero elements are specified in sparse col ‘’ (7) of matrix ‘A’.
MOSEK warning 710: #2 (nearly) zero elements are specified in sparse col ‘’ (8) of matrix ‘A’.
MOSEK warning 710: #1 (nearly) zero elements are specified in sparse col ‘’ (9) of matrix ‘A’.
MOSEK warning 710: #1 (nearly) zero elements are specified in sparse col ‘’ (10) of matrix ‘A’.
Warning number 710 is disabled.
Problem
Name :
Objective sense : min
Type : CONIC (conic optimization problem)
Constraints : 5
Cones : 1
Scalar variables : 58
Matrix variables : 2
Integer variables : 0

Optimizer started.
Presolve started.
Linear dependency checker started.
Linear dependency checker terminated.
Eliminator started.
Freed constraints in eliminator : 0
Eliminator terminated.
Eliminator started.
Freed constraints in eliminator : 0
Eliminator terminated.
Eliminator - tries : 2 time : 0.00
Lin. dep. - tries : 1 time : 0.00
Lin. dep. - number : 0
Presolve terminated. Time: 0.05
Problem
Name :
Objective sense : min
Type : CONIC (conic optimization problem)
Constraints : 5
Cones : 1
Scalar variables : 58
Matrix variables : 2
Integer variables : 0

“It is known to the solver that the zero elements are actually zero” because CVX passes this information to the solver in the form of the constraints X_{ij} = 0, as written by Erling. Otherwise, how would the solver know? Without that information the solver would consider these elements unrestricted, with no bounds.

However, in this case this is probably not relevant, because as you can see from the log, CVX dualized the problem before calling MOSEK, so everything the solver sees is the dual of what you might think, and in particular the number of constraints may be significantly reduced. This may save the day. See Example 8.7 in https://docs.mosek.com/modeling-cookbook/duality.html#semidefinite-duality-and-lmis for this type of trick and why it might be relevant. You can also dualize your model yourself and input the dual directly to see how it performs.

Thank you for your comprehensive response. Sure, I will follow your suggestions.

By the way, isn't there a way to mathematically reduce the dimension of such sparse matrices without losing a considerable amount of information? (In this case the relevant information is the sign of the eigenvalues, since the constraint enforces semidefiniteness.)

There are two matrix variables in the problem Mosek sees.

You have to reformulate the problem so the size of those variables gets smaller without introducing too many constraints. That may or may not be possible.

Using Mosek v10 and a faster computer is likely the best way to speed up the solution process.

If you want a better assessment of what CVX does or does not provide to the solver, you should show us your CVX code, preferably reproducible, with input data if possible. As @Michal_Adamaszek points out, CVX provided the dual to the solver, as shown at the beginning of the output, so that complicates things. Also, you should pay attention to Mosek’s warnings about near-zero elements.

When formulating your problem, you can use concatenation, which often works well for many types of structured sparsity. Individual (matrix) variables can be declared with any combination of structure keywords, as discussed in http://cvxr.com/cvx/doc/basics.html#variables . CVX does not support blkvar, but YALMIP does, so you might be better off using YALMIP, although I think you can "manually" achieve the effect of blkvar by concatenating the constituent diagonal matrices and zero matrices of appropriate dimensions. I don't know what CVX does behind the scenes in all cases when these keywords are used, but if they are applicable, you should probably use them, in combination with building up the higher-level matrices from these building blocks using concatenation.
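As a rough illustration of the "manual blkvar" idea, one might assemble A = [D v; v^H alpha] from variable blocks and constant zeros like this. All dimensions, block sizes, and the objective are assumptions for the sketch, not the actual model:

```matlab
% Sketch: building the full matrix A from blocks by concatenation,
% so the structural zeros in D and v are constants rather than variables.
m = 3;  n = 3*m;                        % D is n-by-n (illustrative split)
cvx_begin sdp
    variable X1(m,m) hermitian
    variable X2(m,m) hermitian
    variable w(n - floor(n/2), 1) complex   % only the nonzero lower part of v
    variable alph                            % the scalar alpha
    Z = zeros(m);
    D = [X1 Z Z;
         Z  Z Z;
         Z  Z X2];                      % block-diagonal D with a constant zero block
    v = [zeros(floor(n/2),1); w];       % structural zeros in the upper half of v
    A = [D  v;
         v' alph];                      % v' is the conjugate transpose (v^H)
    A >= 0;                             % Hermitian PSD constraint on A
    minimize( alph )                    % placeholder objective
cvx_end
```

Since D is declared hermitian and v' is the conjugate transpose, A is Hermitian by construction, so the PSD constraint applies directly.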