Sparser Johnson-Lindenstrauss Transforms. Jelani Nelson. Talk

This was a theory talk; all points here should be considered suspect until verified, as I'm not used to these sorts of talks and may have misunderstood.
  1. JL Lemma: every set of n points in Euclidean space can be embedded into O(eps^-2 log n)-dimensional Euclidean space so that all pairwise distances are preserved to within a 1+eps factor (the original dimension is arbitrary – the target dimension depends only on the number of points)
    1. Speeds up geometric algorithms by reducing the dimension of the input
    2. Enables low-memory streaming algorithms for linear algebra
    3. Related to RIP matrices from compressed sensing
  2. Given an eps and delta, you can find a random matrix that performs the dimension reduction while preserving distances (with probability 1-delta?). One can show how large the matrix has to be.
  3. The embedding is normally expensive to apply; this talk is about how to do it more efficiently
  4. Existing speedups achieve much better embedding time, roughly O(eps^-2 log n log^4 d), but are slow when embedding sparse vectors
  5. The speaker's approach brought the cost down to O(eps^-1 log(1/delta)) for sparse vectors
  6. Recovery procedure isn’t applicable in all settings
  7. …lost
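
To make the ideas above concrete, here is a minimal numpy sketch of a sparse JL embedding in the spirit of the talk: each column of the embedding matrix has a small number s of nonzero ±1/sqrt(s) entries, so applying it to a sparse vector touches only s entries per nonzero. The specific construction, constants, and parameter choices below are my illustrative assumptions, not the speaker's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_jl_matrix(m, d, s, rng):
    """Build an m x d sparse JL-style matrix: each column has exactly s
    nonzero entries, each +-1/sqrt(s), at uniformly random rows.
    (Illustrative construction, not the exact one from the talk.)"""
    S = np.zeros((m, d))
    for j in range(d):
        rows = rng.choice(m, size=s, replace=False)
        signs = rng.choice([-1.0, 1.0], size=s)
        S[rows, j] = signs / np.sqrt(s)
    return S

# Embed n random points from R^d into R^m and check pairwise distortion.
n, d, eps = 50, 1000, 0.25
m = int(np.ceil(4 * np.log(n) / eps**2))   # O(eps^-2 log n) target dimension
s = max(1, int(np.ceil(np.log(n) / eps)))  # O(eps^-1 log(1/delta))-style sparsity

X = rng.standard_normal((n, d))
S = sparse_jl_matrix(m, d, s, rng)
Y = X @ S.T  # embedded points, one per row

def pairwise_dists(A):
    return np.linalg.norm(A[:, None, :] - A[None, :, :], axis=-1)

D0, D1 = pairwise_dists(X), pairwise_dists(Y)
iu = np.triu_indices(n, k=1)          # distinct pairs only
ratios = D1[iu] / D0[iu]              # embedded distance / original distance
print("distortion range:", ratios.min(), ratios.max())
```

Running this, the ratios should cluster around 1, illustrating that all pairwise distances survive the reduction from d = 1000 down to m = O(eps^-2 log n) dimensions, while each column of the matrix has only s nonzeros.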
