Metamodeling, the science of modeling functions observed at a finite number of points, benefits from all the auxiliary information it can account for. Function gradients are a common form of auxiliary information and are useful for predicting functions with locally changing behaviors. This article reviews the main metamodels that use function gradients in addition to function values. Its goal is to give the reader an overview of the principles involved in gradient-enhanced metamodels while also providing insightful formulations. The following metamodels have gradient-enhanced versions in the literature and are reviewed here: classical, weighted and moving least squares, Shepard weighting functions, and t...
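To make the idea of combining values and gradients concrete, the sketch below (not taken from the reviewed article; the cubic basis, the sin/cos test data, and the function names are illustrative assumptions) builds a one-dimensional gradient-enhanced least-squares fit by stacking derivative equations onto the usual value equations.

```python
import numpy as np

# Gradient-enhanced least squares, minimal 1-D sketch (illustrative assumptions):
# fit a cubic polynomial to observed function values AND observed derivatives by
# stacking both sets of linear equations into a single least-squares system.

def fit_gradient_enhanced_poly(x, y, dy, degree=3):
    # Value rows of the design matrix: [1, x, x^2, x^3]
    V = np.vander(x, degree + 1, increasing=True)
    # Derivative rows: [0, 1, 2x, 3x^2]
    D = np.zeros_like(V)
    for k in range(1, degree + 1):
        D[:, k] = k * x ** (k - 1)
    A = np.vstack([V, D])           # stacked value + gradient design matrix
    b = np.concatenate([y, dy])     # stacked observations
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

# Toy data: values and slopes of sin(x) at five points
x = np.linspace(0.0, np.pi, 5)
print(fit_gradient_enhanced_poly(x, np.sin(x), np.cos(x)))
```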
The theory of gradient analysis is presented in this chapter, in which the heuristic techniques are ...
In this work we deal with the problem of metalearning for kernel based methods. Among the kernel met...
In regression problems over ℝd, the unknown function f often varies more in some coordinates than in...
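As a toy illustration of that anisotropy (a sketch only: averaging squared partial derivatives is one simple gradient-based relevance measure, not necessarily the estimator used in the paper, and the test function is an assumption):

```python
import numpy as np

# Estimate which coordinates f varies most along by averaging squared partial
# derivatives over sample points (simple illustrative relevance measure).

def coordinate_relevance(grad_f, X):
    G = np.array([grad_f(x) for x in X])   # one gradient vector per sample point
    return np.mean(G ** 2, axis=0)         # mean squared slope per coordinate

# f(x) = 5*x0 + 0.1*x1 varies far more along the first coordinate
grad_f = lambda x: np.array([5.0, 0.1])
X = np.random.default_rng(0).uniform(-1.0, 1.0, size=(100, 2))
print(coordinate_relevance(grad_f, X))     # roughly [25.0, 0.01]
```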
In the context of optimization, derivatives of the objective function or the c...
In order to increase the efficiency of design optimization, much effort has been devoted to studying ...
Radial basis functions (RBFs), among other techniques, are used to construct metamodels that approxi...
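A minimal RBF metamodel can be sketched as follows (the Gaussian kernel, fixed length scale, tiny diagonal "nugget" for numerical stability, and toy data are all assumptions made here for illustration, not the cited work's implementation):

```python
import numpy as np

# Radial-basis-function metamodel sketch: the surrogate is a weighted sum of
# Gaussian kernels centred at the sample points; the weights are chosen so the
# surrogate reproduces the observed responses.

def rbf_fit(X, y, length_scale=1.0, nugget=1e-10):
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * length_scale ** 2)) + nugget * np.eye(len(y))
    return np.linalg.solve(K, y)            # interpolation condition K w = y

def rbf_predict(X_train, weights, X_new, length_scale=1.0):
    d2 = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2)) @ weights

# Toy 2-D example: approximate f(x) = x0^2 + x1 from 20 random samples
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(20, 2))
y = X[:, 0] ** 2 + X[:, 1]
w = rbf_fit(X, y)
print(rbf_predict(X, w, np.array([[0.5, -0.2]])))
```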
Gradient boosting machines are a family of powerful machine-learning techniques that have shown cons...
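The core boosting loop can be written in a few lines. The sketch below is a bare-bones version for squared-error loss (shallow scikit-learn trees, the learning rate, depth, and toy data are illustrative assumptions): each stage fits a tree to the current residuals, i.e. the negative gradient of the loss, and adds a damped copy of it to the ensemble.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Minimal gradient boosting for squared-error loss (illustrative sketch).

def fit_gbm(X, y, n_stages=100, learning_rate=0.1, max_depth=2):
    prediction = np.full(len(y), y.mean())   # constant initial model
    trees = []
    for _ in range(n_stages):
        residuals = y - prediction           # negative gradient of 0.5*(y - f)^2
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return y.mean(), trees

def predict_gbm(model, X, learning_rate=0.1):
    base, trees = model
    return base + learning_rate * sum(t.predict(X) for t in trees)

# Toy usage on a noisy 1-D function
rng = np.random.default_rng(0)
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
model = fit_gbm(X, y)
print(predict_gbm(model, np.array([[1.5], [4.0]])))
```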
Gradient-based Meta-RL (GMRL) refers to methods that maintain two-level optimisation procedures wher...
Stochastic kriging is a new metamodeling technique for effectively representing the mean response su...
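In the spirit of stochastic kriging, the sketch below adds an estimated simulation-noise variance to the diagonal of the spatial covariance before solving the kriging system. The Gaussian kernel, constant mean, hyperparameters, and variable names are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

# Kriging predictor with a per-point noise term on the diagonal (illustrative).

def gaussian_cov(A, B, length_scale=1.0, process_var=1.0):
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return process_var * np.exp(-d2 / (2.0 * length_scale ** 2))

def kriging_predict(X, y_bar, noise_var, X_new):
    # X: design points, y_bar: averaged simulation outputs,
    # noise_var: estimated variance of the simulation noise at each design point.
    K = gaussian_cov(X, X) + np.diag(noise_var)
    k_new = gaussian_cov(X_new, X)
    alpha = np.linalg.solve(K, y_bar - y_bar.mean())
    return y_bar.mean() + k_new @ alpha

rng = np.random.default_rng(1)
X = rng.uniform(0, 5, size=(15, 1))
y_bar = np.sin(X[:, 0]) + 0.05 * rng.normal(size=15)   # noisy averaged outputs
noise_var = np.full(15, 0.05 ** 2)
print(kriging_predict(X, y_bar, noise_var, np.array([[2.5]])))
```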
One approach for interpreting black-box machine learning models is to find a global approximation of...
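A simple global-surrogate sketch of that idea (the random-forest black box, the depth-3 tree surrogate, and the synthetic data are all assumptions chosen for illustration): an interpretable model is trained on the black box's own predictions and then inspected.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor, export_text

# Approximate a black-box model with a small, readable decision tree
# trained on the black box's predictions (global surrogate sketch).

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 3))
y = X[:, 0] ** 2 + np.sin(3 * X[:, 1]) + 0.1 * rng.normal(size=500)

black_box = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
surrogate = DecisionTreeRegressor(max_depth=3).fit(X, black_box.predict(X))

print(export_text(surrogate, feature_names=["x0", "x1", "x2"]))
```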
Metamodelling offers an efficient way to imitate the behaviour of computationally expensive simulato...
This paper aims at presenting a review of some metamodels used in optimization. We are interested i...
In deterministic computer experiments, it is often known that the output is a monotonic function ...
It seems that in the current age, computers, computation, and data have an increasingly imp...