The result that, for quadratic functions, the classical steepest descent algorithm in R^d converges locally to a two-point attractor was proved by Akaike. In this paper, this result is proved for bounded quadratic operators in Hilbert space. The asymptotic rate of convergence is shown to depend on the starting point while, as expected, respecting the Kantorovich bounds. The introduction of a relaxation coefficient in the steepest descent algorithm completely changes its behaviour, which may become chaotic. Different attractors are presented. We show that relaxation allows a significantly improved rate of convergence.
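The classical iteration discussed above can be sketched in a few lines. Below is a minimal finite-dimensional stand-in for the Hilbert-space setting: steepest descent with the exact (Cauchy) step on a quadratic f(x) = (Ax, x)/2 − (b, x), with an optional relaxation coefficient omega multiplying the step. The matrix A, the vector b, and omega = 1.0 are illustrative choices, not taken from the paper.

```python
import numpy as np

def steepest_descent(A, b, x0, omega=1.0, iters=200):
    """Relaxed steepest descent: x_{k+1} = x_k - omega * alpha_k * g_k,
    where g_k = A x_k - b is the gradient and
    alpha_k = (g_k, g_k) / (g_k, A g_k) is the exact line-search step.
    omega = 1.0 recovers the classical (unrelaxed) algorithm."""
    x = x0.astype(float)
    for _ in range(iters):
        g = A @ x - b                    # gradient of f at x
        alpha = (g @ g) / (g @ (A @ g))  # exact (Cauchy) step length
        x = x - omega * alpha * g
    return x

# Illustrative problem: eigenvalues m = 1, M = 10, so the Kantorovich
# bound on the per-step contraction is ((M - m)/(M + m))^2 = (9/11)^2.
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)

x = steepest_descent(A, b, np.array([2.0, 1.0]))
print(np.allclose(x, x_star))  # True: the classical case converges
```

In the classical case, the normalized gradients g_k / ||g_k|| asymptotically alternate between two fixed directions, which is the two-point attractor in Akaike's result; choosing omega away from 1 changes this asymptotic pattern, as the abstract notes.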
Let E be a real Banach space and let A: E → E be a Lipschitzian generalized strongly accretive opera...
The initial value problem for an integrable system, such as the Nonlinear Schrödinger equation, is s...
We study the convergence of a random iterative sequence of a family of operators on infinite dimensi...
The original publication is available at www.springerlink.com. The asymptotic be...
We consider the special case of the restarted Arnoldi method for approximating the product of a func...
We are interested in the asymptotic behavior of the trajectories of the famous steepest descent evol...
We review the history of the nonlinear steepest descent method for the asymptotic evaluati...
We propose a family of gradient algorithms for minimizing a quadratic function f(x)=(Ax,x)/2−(x,y) i...
This work was also published as a Rice University thesis/dissertation: http://hdl.handle.net/1911/1...
Let H1, H2 be Hilbert spaces, T a bounded linear operator on H1 into H2 such that the range of T, R (...
Let E be a real normed linear space and let A: E ↦ 2^E be a bounded uniformly continuous φ-stro...
In this paper we study a family of gradient descent algorithms to approximate the regression functio...
The initial value problem for an integrable system, such as the Nonlinear Schrödinger equation, is s...
In recent years, it has become increasingly clear that the critical issue in gradient methods is the...
We analyse convergence rates of Smolyak integration for parametric maps u: U → X taking values in a ...