Morteza Monemizadeh - Academia.edu


Papers by Morteza Monemizadeh

Research paper thumbnail of Coresets and Sketches for High Dimensional Subspace Approximation Problems

Proceedings of the Twenty-First Annual ACM-SIAM Symposium on Discrete Algorithms, 2010

We consider the problem of approximating a set P of n points in R^d by a j-dimensional subspace under the ℓ_p measure, in which we wish to minimize the sum of ℓ_p distances from each point of P to this subspace. More generally, the F_q(ℓ_p)-subspace approximation problem asks for a j-subspace that minimizes the sum of q-th powers of ℓ_p-distances to this subspace, up to a multiplicative factor of (1 + ε). We develop techniques for subspace approximation, regression, and matrix approximation that can be used to deal with massive data sets in high-dimensional spaces. In particular, we develop coresets and sketches, i.e., small-space representations that approximate the input point set P with respect to the subspace approximation problem. Our results are:

• A dimensionality reduction method that can be applied to F_q(ℓ_p)-clustering and shape-fitting problems, such as those in [8, 15].

• The first strong coreset for F_1(ℓ_2)-subspace approximation in high-dimensional spaces, i.e., of size polynomial in the dimension of the space. This coreset approximates the distances to any j-subspace (not just the optimal one).

• A (1 + ε)-approximation algorithm for the j-dimensional F_1(ℓ_2)-subspace approximation problem with running time nd·(j/ε)^O(1) + (n + d)·2^poly(j/ε).

• A streaming algorithm that maintains a coreset for the F_1(ℓ_2)-subspace approximation problem and uses space of d·2^(√log n)·2^poly(j/ε) (weighted) points.

• Streaming algorithms for the above problems with bounded precision in the turnstile model, i.e., when coordinates appear in an arbitrary order and undergo multiple updates. We show that bounded precision can lead to further improvements.

We extend results of [7] for approximate linear regression, distances to subspace approximation, and optimal rank-j approximation to error measures other than the Frobenius norm.
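To make the cost function concrete: a minimal pure-Python sketch of the F_q(ℓ_2) objective described in the abstract, i.e., the sum of q-th powers of Euclidean distances from points to a j-dimensional subspace given by an orthonormal basis. This only illustrates the quantity being minimized; it is not the paper's algorithm, and the point set and basis below are made-up examples.

```python
def l2_distance_to_subspace(point, basis):
    """Euclidean distance from `point` to the span of the orthonormal
    vectors in `basis`: the norm of the residual after projecting out
    each basis direction."""
    residual = list(point)
    for b in basis:
        coeff = sum(r * bi for r, bi in zip(residual, b))
        residual = [r - coeff * bi for r, bi in zip(residual, b)]
    return sum(r * r for r in residual) ** 0.5

def subspace_cost(points, basis, q=1):
    """F_q(l_2) cost of a subspace: sum over P of the q-th power of the
    l_2 distance to the subspace. q=1 gives the F_1(l_2) objective from
    the abstract; q=2 recovers the classical sum-of-squares cost."""
    return sum(l2_distance_to_subspace(p, basis) ** q for p in points)

# Example: three points in R^3, candidate 1-subspace = the x-axis.
P = [(1.0, 2.0, 0.0), (3.0, 0.0, 4.0), (0.0, 0.0, 1.0)]
x_axis = [(1.0, 0.0, 0.0)]
print(subspace_cost(P, x_axis, q=1))  # distances 2 + 4 + 1 -> 7.0
print(subspace_cost(P, x_axis, q=2))  # squared: 4 + 16 + 1 -> 21.0
```

A coreset for this problem is a small weighted point set whose `subspace_cost` is within a (1 + ε) factor of that of P simultaneously for every j-subspace, which is what makes it usable inside streaming algorithms.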
