It is well known that tensor network regression models operate on an exponentially large feature space, but questions remain as to how effectively they are able to utilize this space. Using a polynomial featurization, we propose the interaction decomposition as a tool that can assess the relative importance of different regressors as a function of their polynomial degree. We apply this decomposition to tensor ring and tree tensor network models trained on the MNIST and Fashion MNIST datasets, and find that up to 75% of the interaction degrees contribute meaningfully to these models. We also introduce a new type of tensor network model that is explicitly trained on only a small subset of interaction degrees, and find that these models are ...
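The polynomial featurization behind this decomposition can be illustrated with a minimal numpy sketch (the function names and toy inputs below are our own; the tensor ring and tree tensor network models themselves are not reproduced). Each input x_i is mapped to the local feature vector (1, x_i), the full feature space is the tensor product of these local vectors, and each of the 2^n resulting regressors is a product of some subset of inputs; its interaction degree is the size of that subset.

```python
import numpy as np

def tensor_features(x):
    """Full 2^n tensor-product feature vector for the local map x_i -> (1, x_i)."""
    phi = np.array([1.0])
    for xi in x:
        phi = np.kron(phi, np.array([1.0, xi]))
    return phi

def degree_masks(n):
    """Group the 2^n regressor indices by interaction degree.

    Bit k of an index says whether x_k appears as a factor, so the
    popcount of the index is the polynomial degree of that regressor.
    """
    masks = {d: [] for d in range(n + 1)}
    for idx in range(2 ** n):
        degree = bin(idx).count("1")
        masks[degree].append(idx)
    return masks

# Toy example: split a linear model w . phi(x) into per-degree contributions.
x = np.array([0.5, -1.0, 2.0])
phi = tensor_features(x)
w = np.ones(2 ** len(x))  # placeholder weights; a trained model supplies these
for d, idxs in degree_masks(len(x)).items():
    print(f"degree {d}: contribution {w[idxs] @ phi[idxs]:+.2f}")
```

With unit weights the per-degree contributions are just the elementary symmetric polynomials of the inputs, and they sum to the product of (1 + x_i), which gives a quick sanity check on the decomposition.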