Data-driven methods, such as the estimation of primaries by sparse inversion, suffer from the 'curse of dimensionality' that leads to disproportional growth in computational and storage demands when moving to realistic 3D field data. To remove this fundamental impediment, we propose a dimensionality-reduction technique in which the 'data matrix' is approximated adaptively by a randomized low-rank factorization. Compared to conventional methods, which need passes through all the data, possibly including on-the-fly interpolations, for each iteration, our approach reduces the number of passes to between one and three. In addition, the low-rank matrix factorization leads to considerable reductions in the storage and computational costs of the matr...
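The core step named in the abstract is an adaptive randomized low-rank factorization of the data matrix. As a rough illustration of this family of methods only (a generic randomized range-finder in the spirit of Halko, Martinsson and Tropp, not the authors' implementation), the sketch below approximates a matrix D by a product L R; the function name, oversampling amount, and power-iteration count are illustrative assumptions. Note that forming the sample matrix Y costs a single pass over the data, plus one extra pass per optional power iteration, which is the source of the "one to three passes" behaviour mentioned above.

import numpy as np

def randomized_lowrank(D, rank, n_oversample=10, n_power_iter=1, seed=None):
    # Approximate D ~= L @ R with a randomized range finder (generic sketch).
    rng = np.random.default_rng(seed)
    n_cols = D.shape[1]
    # Gaussian test matrix; oversampling improves how well the range is captured.
    Omega = rng.standard_normal((n_cols, rank + n_oversample))
    Y = D @ Omega                          # one pass over the data
    for _ in range(n_power_iter):          # optional power iterations
        Y = D @ (D.T @ Y)                  # each costs one more pass, sharpens the spectrum
    Q, _ = np.linalg.qr(Y)                 # orthonormal basis for the range of D
    B = Q.T @ D                            # small projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    L = (Q @ Ub[:, :rank]) * s[:rank]      # left factor, columns scaled by singular values
    R = Vt[:rank, :]                       # right factor
    return L, R

# Example: a 500 x 400 matrix of numerical rank ~20
rng = np.random.default_rng(0)
D = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 400))
L, R = randomized_lowrank(D, rank=20)
print(np.linalg.norm(D - L @ R) / np.linalg.norm(D))  # small relative error

Storing the factors L and R instead of D, and applying D via the two skinny factors, is what yields the reductions in storage and in the cost of matrix multiplies that the abstract refers to.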