Multi-task learning seeks to improve generalization performance by sharing common information among multiple related tasks. A key assumption in most MTL algorithms is that all tasks are related, which, however, may not hold in many real-world applications. Existing techniques that attempt to address this issue identify groups of related tasks using group sparsity. In this paper, we propose a probabilistic tree sparsity (PTS) model that exploits a tree structure, rather than a group structure, to obtain the sparse solution. Specifically, each model coefficient in the learning model is decomposed into a product of multiple component coefficients, each of which corresponds to a node in the tree. Based on the decomposition, Gaussian ...
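The decomposition described above can be sketched in a few lines. This is a hypothetical illustration (the function and variable names are assumptions, not from the paper): each leaf's model coefficient is the product of component coefficients on its root-to-leaf path, so zeroing an internal node's component zeros every coefficient in that node's subtree, which is exactly the tree-structured sparsity pattern.

```python
# Hypothetical sketch of the tree-based coefficient decomposition.
# Each model coefficient w_leaf is a product of per-node component
# coefficients along the root-to-leaf path; a zero component at an
# internal node zeros out all coefficients in its subtree.

def path_to_root(node, parent):
    """Collect nodes from `node` up to the root via a parent map."""
    path = [node]
    while parent[node] is not None:
        node = parent[node]
        path.append(node)
    return path

def leaf_coefficient(leaf, parent, component):
    """w_leaf = product of component coefficients on the root-to-leaf path."""
    w = 1.0
    for node in path_to_root(leaf, parent):
        w *= component[node]
    return w

# Toy tree: root 0 with children 1 and 2; leaves 3 and 4 under node 1,
# leaf 5 under node 2.
parent = {0: None, 1: 0, 2: 0, 3: 1, 4: 1, 5: 2}
component = {0: 1.0, 1: 0.0, 2: 0.5, 3: 2.0, 4: -1.0, 5: 3.0}

# Node 1's component is zero, so both leaves in its subtree vanish.
print(leaf_coefficient(3, parent, component))  # -> 0.0
print(leaf_coefficient(5, parent, component))  # -> 1.5
```

The point of the toy example is the structural effect: setting `component[1] = 0.0` prunes leaves 3 and 4 together, while leaf 5 is unaffected.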
In this paper, we formulate human action recognition as a novel Multi-Task Sparse Learning (MTSL) fra...
Multi-task learning has been shown to significantly enhance the performance of multiple related learning...
This paper investigates a new learning formulation called structured sparsity, which is a natural ...
We consider the problem of learning a sparse multi-task regression, where the structure in the outpu...
Multi-task learning (MTL) aims to improve generalization performance by learning multiple related ta...
We consider the problem of learning a sparse multi-task regression, where the structure in the outpu...
Sparsity plays an essential role in a number of modern algorithms. This thesis examines how we can i...
For many real-world machine learning applications, labeled data is costly because the data labeling ...
Recently, there has been a lot of interest around multi-task learning (MTL) problem...
We propose a framework MIC (Multiple Inclusion Criterion) for learning sparse models based on the in...
In multi-task learning (MTL), multiple related tasks are learned jointly by sharing information acro...
Multi-task learning (MTL) aims to improve generalization performance by learning multiple related ta...
In this paper, we consider the problem of learning Gaussian multiresolution (MR) models in which dat...
Multi-label classification is a critical problem in many areas of data analysis such as image labeli...
With the increasing availability of large datasets, machine learning techniques are becoming an incr...