Random Projections
This is one of the most beautiful ideas I came across recently while looking for something that works well for dimensionality reduction. It is a very simple technique, yet it works beautifully. It is as follows:
Reduce the given d × N data matrix X to a k × N matrix X' by computing X' (k × N) = R (k × d) · X (d × N).
The core idea behind the entire field is the Johnson-Lindenstrauss lemma, which says that points in a high-dimensional space can be mapped into a much lower-dimensional space in a way that approximately preserves the pairwise distances, with high probability.
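For reference, one common way the lemma is stated (constants vary across sources, so treat this as the standard form rather than the sharpest bound): for any 0 < ε < 1 and any N points in R^d, if k is on the order of (log N) / ε², there exists a linear map f from R^d to R^k such that (1 − ε)·||u − v||² ≤ ||f(u) − f(v)||² ≤ (1 + ε)·||u − v||² for every pair of points u, v. In other words, the target dimension depends only on the number of points and the distortion you are willing to tolerate, not on the original dimension d.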
The matrix R (k × d) is the random projection that maps the data into a k-dimensional subspace. The initial results generated R from a Gaussian distribution, essentially as k random (nearly orthogonal) vectors. Later on, it was shown that the Gaussian can be replaced with a much simpler distribution: draw each entry as +1 or -1, each with probability 1/2, or even as 0, +1 and -1 with probabilities 2/3, 1/6 and 1/6 respectively. This turns the matrix multiplication into additions and subtractions, and the zeros reduce the arithmetic even further. The result looks almost mysterious, yet it is remarkably powerful.
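To make this concrete, here is a minimal NumPy sketch of both variants. The shapes, the 1/sqrt(k) scaling, and the {0, +1, -1} distribution with probabilities {2/3, 1/6, 1/6} follow the construction described above; the function names and the example data are my own, so treat this as an illustration rather than a reference implementation.

```python
import numpy as np

def gaussian_projection(X, k, seed=None):
    """Project the d x N data matrix X down to k dimensions with a Gaussian R."""
    rng = np.random.default_rng(seed)
    d, _ = X.shape
    # Entries drawn i.i.d. from N(0, 1); the 1/sqrt(k) scaling keeps
    # squared distances approximately unbiased after projection.
    R = rng.standard_normal((k, d)) / np.sqrt(k)
    return R @ X  # shape (k, N)

def sparse_projection(X, k, seed=None):
    """Sparse projection: entries are 0, +1, -1 with probabilities
    2/3, 1/6, 1/6, scaled by sqrt(3/k) so distances stay unbiased."""
    rng = np.random.default_rng(seed)
    d, _ = X.shape
    R = rng.choice([0.0, 1.0, -1.0], size=(k, d), p=[2/3, 1/6, 1/6])
    R *= np.sqrt(3.0 / k)
    return R @ X  # shape (k, N)

if __name__ == "__main__":
    # Toy example: 200 points in 10,000 dimensions squeezed into 50 dimensions.
    X = np.random.default_rng(0).standard_normal((10_000, 200))
    X_low = sparse_projection(X, k=50, seed=1)
    print(X_low.shape)  # (50, 200)
```

Note that the sparse variant touches only about a third of the entries of X per output coordinate, which is exactly the arithmetic saving mentioned above.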
Useful Links:
Random projection in dimensionality reduction
Database friendly random projections