4 DEANNA NEEDELL AND ROMAN VERSHYNIN
The Restricted Isometry Condition is a natural abstract deterministic property
of a matrix. Although establishing this property is often nontrivial, this task is
decoupled from the analysis of the recovery algorithm.
L1-minimization is based on linear programming, which has its advantages and
disadvantages. One thinks of linear programming as a black box, and any develop-
ment of fast solvers will reduce the running time of the sparse recovery method. On
the other hand, it is not very clear what this running time is, as there is no strongly
polynomial time algorithm for linear programming yet. All known solvers take time
polynomial not only in the dimension d of the program, but also in certain condition
numbers of the program. While for some classes of random matrices the expected
running time of linear programming solvers can be bounded (see the discussion in
[20] and subsequent work in [23]), estimating condition numbers is hard for specific
matrices. For example, there is no result yet showing that the Restricted Isometry
Condition implies that the condition numbers of the corresponding linear program
are polynomial in d.
Orthogonal Matching Pursuit is quite fast, both theoretically and experimentally.
It makes n iterations, where each iteration amounts to a multiplication by the d × N
matrix Φ∗ (computing the observation vector u) and to solving a least squares problem
in dimensions at most N × n (with the matrix Φ_I). This yields a strongly polynomial
running time. In practice, OMP is observed to perform faster and to be easier to
implement than L1-minimization [21]. For more details, see [21].
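The iteration just described can be sketched in a few lines of NumPy. This is an illustrative implementation, not the code of [21]; the function name `omp` and the test dimensions are our own choices, and Φ is taken as an N × d matrix so that x = Φv and u = Φ∗x match the conventions above.

```python
import numpy as np

def omp(Phi, x, n_iter):
    """Sketch of Orthogonal Matching Pursuit.

    Phi    : (N, d) measurement matrix, with measurements x = Phi @ v
    x      : (N,) measurement vector
    n_iter : number of iterations (the sparsity level n)
    """
    N, d = Phi.shape
    residual = x.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(n_iter):
        # Observation vector u = Phi* r: correlations of the current
        # residual with the columns of Phi.
        u = Phi.T @ residual
        # Select the coordinate of largest correlation.
        k = int(np.argmax(np.abs(u)))
        if k not in support:
            support.append(k)
        # Least squares over the selected columns Phi_I
        # (a problem of size at most N x n).
        coef, *_ = np.linalg.lstsq(Phi[:, support], x, rcond=None)
        residual = x - Phi[:, support] @ coef
    v_hat = np.zeros(d)
    v_hat[support] = coef
    return v_hat
```

For a noiseless Gaussian measurement matrix with N well above the sparsity level, this sketch recovers a fixed sparse signal exactly, in line with the non-uniform guarantees discussed below.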
Orthogonal Matching Pursuit is quite transparent: at each iteration, it selects
a new coordinate from the support of the signal v in a very specific and natural
way. In contrast, the known L1-minimization solvers, such as the simplex method
and interior point methods, compute a path toward the solution. However, the
geometry of L1 is clear, whereas the analysis of greedy algorithms can be difficult
simply because they are iterative.
On the other hand, Orthogonal Matching Pursuit has weaker guarantees of exact
recovery. Unlike L1-minimization, the guarantees of OMP are non-uniform: the
algorithm performs correctly with high probability for each fixed sparse signal v,
but not for all signals simultaneously. Rauhut has shown that uniform guarantees for OMP are
impossible for natural random measurement matrices [18].
Moreover, OMP’s condition on measurement matrices given in [21] is more restrictive
than the Restricted Isometry Condition. In particular, it is not known whether
OMP succeeds in the important class of partial Fourier measurement matrices.
These open problems about OMP, first stated in [21] and often echoed in the
Compressed Sensing community, motivated the present paper. We essentially settle
them in the affirmative with the following modification of Orthogonal Matching Pursuit.
1.4. Regularized OMP. This new algorithm for sparse recovery will perform cor-
rectly for all measurement matrices Φ satisfying the Restricted Isometry Condition,
and for all sparse signals.
When we are trying to recover the signal v from its measurements x = Φv, we can
use the observation vector u = Φ∗x as a good local approximation to the signal v.
Namely, the observation vector u encodes correlations of the measurement vector x