Parameter estimation in empirical fields is usually undertaken using parametric models, and such models readily facilitate statistical inference. Unfortunately, they are often insufficiently flexible to model real-world phenomena adequately, and may yield biased estimates. Conversely, non-parametric approaches are flexible but do not readily facilitate statistical inference and may still exhibit residual bias. We explore the potential for Influence Functions (IFs) to (a) improve initial estimators without needing more data, (b) increase model robustness, and (c) facilitate statistical inference. We begin with a broad introduction to IFs, and propose a neural network method 'MultiNet', which seeks the diversity of an ensemble...
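As a concrete illustration of point (a), a minimal sketch of the classic one-step (AIPW-style) influence-function correction for an average treatment effect, in a simulated setting. This is an illustration of the general technique, not the abstract's MultiNet method; the data-generating process is invented, and the propensity score is taken as known to keep the example short.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Simulated observational data: X confounds both treatment T and outcome Y.
X = rng.normal(size=n)
e = 1.0 / (1.0 + np.exp(-X))          # true propensity score P(T=1 | X), assumed known here
T = rng.binomial(1, e)
Y = X + 2.0 * T + rng.normal(size=n)  # true average treatment effect is 2

# Deliberately crude initial estimator: ignore the confounder X and use
# marginal group means as the outcome "models".
mu1 = Y[T == 1].mean()
mu0 = Y[T == 0].mean()
plug_in = mu1 - mu0                   # biased: naive difference in means

# One-step correction: add the sample mean of the estimated efficient
# influence function to the plug-in estimate. No new data is needed.
correction = np.mean(T * (Y - mu1) / e - (1 - T) * (Y - mu0) / (1 - e))
one_step = plug_in + correction
```

Here the plug-in estimate overshoots the true effect of 2 because treated units have systematically larger X, while the one-step estimate recovers it; this is the sense in which an IF-based update can debias an initial estimator using only the data in hand.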
Can we learn the influence of a set of people in a social network from cascades of information diffu...
The chapters of this dissertation explore the theoretical and empirical potential of neural networks...
In terms of the Bias/Variance decomposition, very flexible (i.e., complex) Supervised Machine Learni...
Understanding the black-box prediction for neural networks is challenging. To achieve this, early st...
Evaluation of treatment effects and more general estimands is typically achieved via parametric mode...
We propose and analyze estimators for statistical functionals of one or more distributions under non...
In this paper, we tackle the problem of finding potentially problematic sample...
We establish PAC learnability of influence functions for three common influence models, namely, the ...
There are many economic parameters that depend on nonparametric first steps. Examples include games,...
A mixed order associative neural network with n neurons and a modified Hebbian learning rule can lea...
In the statistics and machine learning communities, there exists a perceived dichotomy be- tween sta...
Neural Networks (NN) have demonstrated remarkable time series fitting and prediction abilities, outp...
Feedforward neural networks trained by error backpropagation are examples of nonparametric regressio...