Statistical learning methods often embed the data in a latent space where the classification or regression task can be carried out efficiently. For several reasons, such as interpretability, it is desirable to map the results back into the input space. This so-called pre-image problem is a hard, ill-posed problem, as one seeks to invert implicit transformations in kernel-based machines, or multiple nonlinear embeddings in deep neural networks. In this paper, we propose a classification method that does not suffer from the pre-image problem, thanks to a novel class of generative neural networks called Normalizing Flows. Experiments show good results in both classification and interpretability.
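A minimal sketch of why invertibility sidesteps the pre-image problem (illustrative numpy code with toy linear "networks", not the model proposed in this paper): one RealNVP-style affine coupling layer, whose inverse is available in closed form, so mapping a latent point back to the input space requires no optimization.

    import numpy as np

    rng = np.random.default_rng(0)
    # Toy stand-ins for the scale/translation networks (hypothetical,
    # for illustration only).
    W1, W2 = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))

    def s_t(x_a):
        return np.tanh(x_a @ W1), x_a @ W2

    def forward(x):
        x_a, x_b = x[:, :2], x[:, 2:]      # split the 4-d input in half
        s, t = s_t(x_a)
        z_b = x_b * np.exp(s) + t          # transform one half, keep the other
        return np.concatenate([x_a, z_b], axis=1)

    def inverse(z):
        z_a, z_b = z[:, :2], z[:, 2:]
        s, t = s_t(z_a)                    # same networks, so inversion is exact
        x_b = (z_b - t) * np.exp(-s)
        return np.concatenate([z_a, x_b], axis=1)

    x = rng.normal(size=(5, 4))
    assert np.allclose(inverse(forward(x)), x)  # exact pre-image, no search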
Data augmentation is a widely adopted technique for avoiding overfitting when training deep neural n...
Many application domains, spanning from low-level computer vision to medical imaging, require high-f...
Due to the success of generative flows to model data distributions, they have been explored in inver...
In Machine Learning, data embedding is a fundamental aspect of creating nonlinear models. However, t...
Master's thesis for the Master's Degree in Data Science, academic year 2020-2021. Directors: Vicenç Gómez (UPF...
The two key characteristics of a normalizing flow are that it is invertible (in particular, dimension...
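Those two properties are what make the density tractable. By the standard change-of-variables formula (a textbook identity, not a claim specific to this abstract), a flow $f$ with base density $p_Z$ assigns to a data point $x$ the exact log-likelihood

    \log p_X(x) = \log p_Z(f(x)) + \log \left| \det \frac{\partial f(x)}{\partial x} \right|,

so invertibility yields exact sampling through $f^{-1}$, and an efficiently computable Jacobian determinant yields exact likelihood evaluation.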
The framework of normalizing flows provides a general strategy for flexible variational inference of...
Deep Learning is becoming a standard tool across science and industry to optimally solve a variety o...
Neural network models able to approximate and sample high-dimensional probability distributions are ...
In this article, we propose an approach that enables the resolution of the problem ...
Normalizing flows model complex probability distributions by combining a base distribut...
The proliferation of kernel methods rests essentially on the kernel trick, which induces an implicit ...
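A standard illustration of that trick (textbook material, not drawn from this abstract): the Gaussian kernel

    k(x, x') = \exp\left( -\frac{\| x - x' \|^2}{2\sigma^2} \right) = \langle \varphi(x), \varphi(x') \rangle

evaluates an inner product in an infinite-dimensional feature space without ever constructing $\varphi(x)$ explicitly; recovering an input whose image is a given point of that feature space is precisely the ill-posed pre-image step discussed above.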
Various normalization layers have been proposed to help the training of neural networks. Group Norma...
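For context, a minimal numpy sketch of what a group-normalization layer computes (an illustrative implementation of the published operation, with the learnable scale and shift omitted; not code from the cited work):

    import numpy as np

    def group_norm(x, num_groups, eps=1e-5):
        # x: (N, C, H, W); normalize each group of C // num_groups channels
        # per sample, using that group's own mean and variance.
        n, c, h, w = x.shape
        g = x.reshape(n, num_groups, c // num_groups, h, w)
        mean = g.mean(axis=(2, 3, 4), keepdims=True)
        var = g.var(axis=(2, 3, 4), keepdims=True)
        g = (g - mean) / np.sqrt(var + eps)
        return g.reshape(n, c, h, w)

    out = group_norm(np.random.randn(2, 8, 4, 4), num_groups=4)

Unlike batch normalization, the statistics involve no batch dimension, so the computation is identical at training and test time.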
Normalizing flows have emerged as an important family of deep neural networks for modelling complex ...