In this paper, we consider gradient methods for minimizing smooth convex functions, which employ information obtained at previous iterations in order to accelerate convergence towards the optimal solution. This information is used in the form of a piecewise-linear model of the objective function, which provides us with much better prediction abilities than the standard linear model. To the best of our knowledge, this approach has never really been applied in Convex Minimization to differentiable functions, in view of the high complexity of the corresponding auxiliary problems. However, we show that all necessary computations can be done very efficiently. Consequently, we get new optimization methods, which are better than t...
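The core computation suggested by this abstract is an auxiliary problem of the form min_y { max_i [ f(z_i) + <g_i, y - z_i> ] + (L/2)||y - x_k||^2 }, whose dual is a small quadratic program over the simplex. The following Python sketch illustrates one way such a method could be organized; the memory size m, the use of a few Frank-Wolfe steps on the dual QP, and all function and variable names are illustrative assumptions, not the scheme from the paper.

import numpy as np

def gradient_method_with_memory(f, grad_f, x0, L, m=5, iters=100, fw_steps=20):
    """Sketch of a gradient method keeping a piecewise-linear model
    built from the last m oracle answers (hypothetical implementation)."""
    x = x0.copy()
    Z, G, F = [], [], []                    # past points, gradients, values
    for _ in range(iters):
        Z.append(x.copy()); G.append(grad_f(x)); F.append(f(x))
        Z, G, F = Z[-m:], G[-m:], F[-m:]    # keep only the last m pieces
        Gm = np.array(G)                    # stacked gradients, shape (k, n)
        # Value of each stored linear piece at the current point x:
        c = np.array([F[i] + Gm[i] @ (x - Z[i]) for i in range(len(Z))])
        # The auxiliary problem
        #   min_y  max_i { F[i] + <G[i], y - Z[i]> } + (L/2) ||y - x||^2
        # has the dual  max_{lam in simplex}  c @ lam - ||Gm.T @ lam||^2 / (2L),
        # a small QP solved here by a few Frank-Wolfe steps.
        lam = np.ones(len(Z)) / len(Z)
        for t in range(fw_steps):
            grad_dual = c - Gm @ (Gm.T @ lam) / L
            vertex = np.zeros_like(lam)
            vertex[np.argmax(grad_dual)] = 1.0
            lam += 2.0 / (t + 2.0) * (vertex - lam)
        x = x - (Gm.T @ lam) / L            # primal point recovered from the dual
    return x

# Hypothetical usage on a simple quadratic with L = largest eigenvalue:
A = np.diag([1.0, 10.0])
x_min = gradient_method_with_memory(lambda x: 0.5 * x @ A @ x,
                                    lambda x: A @ x,
                                    np.array([3.0, -2.0]), L=10.0)

Note that with m = 1 the bundle holds only the current point, lam is identically 1, and the update reduces to the standard gradient step x - grad_f(x)/L, which is consistent with the claim that the piecewise-linear model strictly generalizes the linear one.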
Mini-batch algorithms have been proposed as a way to speed up stochastic convex optimization problem...
In this paper we propose a new approach for constructing efficient schemes for nonsmooth convex opti...
Gradient boosting is a state-of-the-art prediction technique that sequentially produces a model in t...
The convergence behavior of gradient methods for minimizing convex differentiable functions is one o...
The recently introduced Gradient Methods with Memory use a subset of the past oracle information to ...
In this paper we analyze several new methods for solving optimization problems with the objective fu...
We consider optimization methods for convex minimization problems under inexact information on the o...
Data-rich applications in machine learning and control have motivated intense research on large-s...
We consider the problem of minimizing a smooth convex objective function subject to the set of minim...
In this paper, we present new methods for black-box convex minimization. They do not need to know in...
We suggest simple implementable modif...