Publication Details
Title: Spectral Gap Error Bounds for Improving CUR Matrix Decomposition and the Nystrom Method
Author: D. G. Anderson, S. S. Du, M. W. Mahoney, C. Melgaard, K. Wu, and M. Gu
Bibliographic Information: Proceedings of the 18th International Conference on Artificial Intelligence and Statistics (AI & Statistics 2015), San Diego, California
Date: May 2015
Research Area: Research Initiatives
Type: Article in conference proceedings
PDF: https://www.icsi.berkeley.edu/pubs/initiatives/spectralgap15.pdf
Overview:
The CUR matrix decomposition and the related Nystrom method build low-rank approximations of data matrices by selecting a small number of representative rows and columns of the data. Here, we introduce novel spectral gap error bounds that judiciously exploit the potentially rapid spectrum decay in the input matrix, a most common occurrence in machine learning and data analysis. Our error bounds are much tighter than existing ones for matrices with rapid spectrum decay, and they justify the use of a constant amount of over-sampling relative to the rank parameter k, i.e., when the number of columns/rows is l = k + O(1). We demonstrate our analysis on a novel deterministic algorithm, StableCUR, which additionally eliminates a previously unrecognized source of potential instability in CUR decompositions. While our algorithm accepts any method of row and column selection, we implement it with a recent column selection scheme with strong singular value bounds. Empirical results on various classes of real-world data matrices demonstrate that our algorithm is as efficient as, and often outperforms, competing algorithms.
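The overview above describes the general CUR template: choose a small set of representative columns C and rows R of the data matrix A, then form a small linking matrix U so that A is approximated by C U R, using only modest over-sampling (l = k + O(1) selected columns/rows for a rank parameter k). The following is a minimal illustrative sketch of that generic template in Python/NumPy, using truncated-SVD leverage-style scores for selection and a pseudoinverse-based linking matrix; it is a hedged toy example for intuition only, not the paper's StableCUR algorithm or its column-selection scheme.

```python
import numpy as np

def cur_sketch(A, k, oversample=10, seed=0):
    """Toy CUR construction: A is approximated by C @ U @ R.

    This is a generic, textbook-style sketch (leverage-style scores from a
    truncated SVD, pseudoinverse linking matrix). It is NOT the StableCUR
    algorithm from the paper; parameter names and scoring are illustrative
    assumptions.
    """
    m, n = A.shape
    l = min(k + oversample, min(m, n))  # number of columns/rows kept: l = k + O(1)

    # One thin SVD supplies both column and row importance scores.
    U_svd, _, Vt = np.linalg.svd(A, full_matrices=False)
    col_scores = np.sum(Vt[:k, :] ** 2, axis=0)   # squared norms of top-k right singular vectors
    row_scores = np.sum(U_svd[:, :k] ** 2, axis=1)  # squared norms of top-k left singular vectors

    cols = np.argsort(col_scores)[-l:]  # indices of the l highest-scoring columns
    rows = np.argsort(row_scores)[-l:]  # indices of the l highest-scoring rows

    C = A[:, cols]   # selected columns
    R = A[rows, :]   # selected rows
    # Linking matrix chosen so that C @ U @ R best reproduces A in least-squares sense.
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)
    return C, U, R

# Usage: a matrix with rapidly decaying singular values, the regime the
# spectral gap bounds are designed to exploit.
rng = np.random.default_rng(1)
decay = 0.5 ** np.arange(50)                      # rapid spectrum decay
A = (rng.standard_normal((200, 50)) * decay) @ rng.standard_normal((50, 300))

C, U, R = cur_sketch(A, k=10)
rel_err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
print(f"relative Frobenius error: {rel_err:.2e}")
```

Because the spectrum of A decays quickly, even this crude selection with a small constant amount of over-sampling yields a small relative error, which is the qualitative behavior the paper's spectral gap bounds quantify.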
Bibliographic Reference:
D. G. Anderson, S. S. Du, M. W. Mahoney, C. Melgaard, K. Wu, and M. Gu. Spectral Gap Error Bounds for Improving CUR Matrix Decomposition and the Nystrom Method. Proceedings of the 18th International Conference on Artificial Intelligence and Statistics (AI & Statistics 2015), San Diego, California, May 2015