Meta-learning can successfully acquire useful inductive biases from data. Yet its generalization to unseen learning tasks is poorly understood, raising concerns about overfitting, particularly when the number of meta-training tasks is small. We provide a theoretical analysis using the PAC-Bayesian framework and derive novel generalization bounds for meta-learning. Using these bounds, we develop a class of PAC-optimal meta-learning algorithms with performance guarantees and a principled meta-level regularization. Unlike previous PAC-Bayesian meta-learners, our method results in a standard stochastic optimization problem which can be solved efficiently and scales well. When instantiating our PAC-optimal hyper-posterior (PACOH) ...
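For context, analyses like the one above lift a single-task PAC-Bayesian bound to the meta-level. One classical form of that base-level bound (in the McAllester/Maurer style, stated here as background rather than as the bound derived in the abstract above) reads:

```latex
% With probability at least 1 - \delta over an i.i.d. sample S of size n,
% simultaneously for all posteriors Q over hypotheses, given a prior P
% chosen before seeing S:
\mathcal{L}(Q) \;\le\; \widehat{\mathcal{L}}_S(Q)
  \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}
```

Here \(\mathcal{L}(Q)\) is the expected risk, \(\widehat{\mathcal{L}}_S(Q)\) the empirical risk on \(S\), and \(\mathrm{KL}(Q\|P)\) the divergence between posterior and prior. Meta-learning bounds of the kind discussed in these abstracts additionally treat the prior \(P\) as learned from previous tasks, with a hyper-prior/hyper-posterior pair playing the analogous role one level up.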
In this work, we construct generalization bounds to understand existing learning algorithms and prop...
Risk bounds, which are also called generalisation bounds in the statistical learning literature, are...
PAC-Bayesian bounds are known to be tight and informative when studying the generalization ability o...
Meta-Learning promises to enable more data-efficient inference by harnessing previous experience fro...
Meta-learning aims to leverage experience from previous tasks to achieve an effective and fast adapt...
Meta-learning automatically infers an inductive bias by observing data from a number of related task...
Recent papers have demonstrated that both predicate invention and the learning of recursion can be e...
We propose a PAC-Bayesian analysis of the transductive...
We introduce a new, rigorously-formulated Bayesian meta-learning algorithm that learns a probability...
Generalised Bayesian learning algorithms are increasingly popular in machine learning, due to their ...
This paper introduces a new framework for data efficient and versatile learning. Specifically: 1) We...
Obtaining reliable, adaptive confidence sets for prediction functions (hypotheses) is a central chal...
Meta-learning is uniquely effective and fast at tackling emerging tasks with limited data. ...
In machine learning, Domain Adaptation (DA) arises when the distribution generating the test (targe...