Final projects of the Master's degree in Advanced Mathematics, Faculty of Mathematics, Universitat de Barcelona, Year: 2018, Advisor: Jesús Cerquides Bueno. Function optimization is a widely encountered problem today. In particular, it underlies every learning algorithm in AI, whose performance is measured by a loss function. Multinomial logistic regression is a commonly applied model that simplifies the problem of predicting a categorically distributed variable depending on a set of distinct categorically distributed variables, while gradient descent allows us to reach local extrema of a smooth function. Moreover, large datasets force the use of online optimization. Improving the convergence speed and re...
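The abstract above combines multinomial logistic regression with gradient descent. A minimal sketch of one gradient-descent step for that model, on a single made-up example with 3 classes and 2 features (all numbers and names here are illustrative assumptions, not taken from the thesis):

```python
# Illustrative sketch: one gradient-descent step for multinomial logistic
# regression (softmax regression). Not the thesis' implementation.
import math

def softmax(z):
    m = max(z)                              # shift for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def gd_step(W, x, y, eta=0.5):
    """W: per-class weight vectors; x: feature vector; y: true class index."""
    z = [sum(w_j * x_j for w_j, x_j in zip(w, x)) for w in W]
    p = softmax(z)                          # predicted class probabilities
    for k in range(len(W)):
        # gradient of cross-entropy w.r.t. W[k] is (p_k - 1{k=y}) * x
        err = p[k] - (1.0 if k == y else 0.0)
        W[k] = [w_j - eta * err * x_j for w_j, x_j in zip(W[k], x)]
    return W, p

W = [[0.0, 0.0], [0.0, 0.0], [0.0, 0.0]]
W, p_before = gd_step(W, x=[1.0, 2.0], y=0)
_, p_after = gd_step([row[:] for row in W], x=[1.0, 2.0], y=0)
# After one step, the probability assigned to the true class increases.
```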
This work considers optimization methods for large-scale machine learning (ML). Optimization in ML ...
Current machine learning practice requires solving huge-scale empirical risk minimization problems q...
The Stochastic Gradient Descent (SGD) algorithm, despite its simplicity, is considered an effective and ...
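The SGD update that several of these abstracts refer to can be sketched in a few lines. This is a minimal illustration on a toy least-squares objective, assuming a fixed step size; the objective, data, and step size are invented for demonstration and are not drawn from any of the works listed:

```python
# Minimal SGD sketch: at each step, sample one example and take the step
# w <- w - eta * grad_i(w) using only that example's gradient.
import random

def sgd(examples, eta=0.1, epochs=100, seed=0):
    """Fit scalar w minimizing (1/n) * sum_i (w*a_i - b_i)^2, one example at a time."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(epochs):
        a, b = rng.choice(examples)      # sample one (a_i, b_i) pair
        grad = 2.0 * a * (w * a - b)     # gradient of (w*a - b)^2 w.r.t. w
        w -= eta * grad                  # stochastic gradient step
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # data consistent with w* = 2
w_hat = sgd(data)
```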
Doctoral programme in Mathematics and Computer Science / Thesis carried out at the Institut d'Investigació en In...
Gradient-based algorithms are popular when solving unconstrained optimization problems. By exploitin...
The notable changes over the current version: - worked example of convergence rates showing SAG can ...
The goal of this paper is to debunk and dispel the magic behind black-box optimizers and stochastic ...
In this article, we propose a new method for multiobjective optimization probl...
Modern machine learning models are complex, hierarchical, and large-scale and are trained using non-...
where the function f or its gradient ∇f is not directly accessible except through Monte Carlo estim...
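The setting above, where f is only accessible through Monte Carlo estimates, can be sketched with a noisy oracle and an averaged finite-difference gradient estimator. The test function, noise level, and sample count here are illustrative assumptions:

```python
# Sketch of a Monte Carlo setting: f(x) = x^2 is only observable with noise,
# and its gradient is estimated by averaging central finite differences.
import random

rng = random.Random(0)

def noisy_f(x):
    """Monte Carlo oracle: returns f(x) plus Gaussian noise."""
    return x * x + rng.gauss(0.0, 0.01)

def mc_gradient(x, h=0.1, samples=2000):
    """Average many central differences to damp the Monte Carlo noise."""
    total = 0.0
    for _ in range(samples):
        total += (noisy_f(x + h) - noisy_f(x - h)) / (2.0 * h)
    return total / samples

estimate = mc_gradient(1.0)   # true gradient of x^2 at x=1 is 2
```

Averaging drives the estimator's variance down by a factor of the sample count, which is why such estimates become usable inside stochastic optimization loops.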
This thesis presents a family of adaptive curvature methods for gradient-based stochastic ...
237 pages. It seems that in the current age, computers, computation, and data have an increasingly imp...
Despite the recent growth of theoretical studies and empirical successes of neural networks, gradien...
In this paper, a stochastic gradient descent algorithm is proposed for the binary classifica...
In this project a stochastic method for general purpose optimization and machine learning is describ...