30th AAAI Conference on Artificial Intelligence, AAAI 2016, Phoenix, US, 12-17 February 2016
The restricted Boltzmann machine (RBM) has been used as a building block for many successful deep learning models, e.g., deep belief networks (DBN) and the deep Boltzmann machine (DBM). The training of an RBM can be extremely slow in pathological regions. Second-order optimization methods, such as quasi-Newton methods, have been proposed to deal with this problem. However, non-convexity creates many obstacles to training RBMs, including the infeasibility of directly applying second-order optimization methods. To overcome this obstacle, we introduce an EM-like iterative projected quasi-Newton (IPQN) algorithm. Specifically, we iteratively perform...
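Although the abstract is truncated here, the alternation it describes (a sampling step followed by a quasi-Newton parameter update) can be illustrated with a generic sketch. The Python example below is only a rough analogue, not the authors' IPQN algorithm: the CD-1 negative samples, the surrogate objective F(data) - F(negative), and the use of SciPy's L-BFGS-B as the quasi-Newton step are all assumptions made for illustration.

# A minimal sketch, assuming NumPy/SciPy, of an EM-like alternation between a
# sampling step and a quasi-Newton update for a small binary RBM. This is a
# generic illustration of the idea in the abstract, not the authors' IPQN method.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_vis, n_hid = 6, 4

def unpack(theta):
    W = theta[:n_vis * n_hid].reshape(n_vis, n_hid)
    b = theta[n_vis * n_hid:n_vis * n_hid + n_vis]   # visible biases
    c = theta[n_vis * n_hid + n_vis:]                # hidden biases
    return W, b, c

def free_energy(theta, v):
    # Mean free energy: F(v) = -b^T v - sum_j log(1 + exp(c_j + v^T W_:j))
    W, b, c = unpack(theta)
    return np.mean(-v @ b - np.logaddexp(0.0, v @ W + c).sum(axis=1))

def gibbs_step(theta, v):
    # One block-Gibbs sweep v -> h -> v' for binary units.
    W, b, c = unpack(theta)
    p_h = 1.0 / (1.0 + np.exp(-(v @ W + c)))
    h = (rng.random(p_h.shape) < p_h).astype(float)
    p_v = 1.0 / (1.0 + np.exp(-(h @ W.T + b)))
    return (rng.random(p_v.shape) < p_v).astype(float)

# Toy data and initial parameters.
data = (rng.random((64, n_vis)) < 0.3).astype(float)
theta = 0.01 * rng.standard_normal(n_vis * n_hid + n_vis + n_hid)

for it in range(20):
    # "E-like" step: draw negative samples with the current parameters (CD-1 here).
    neg = gibbs_step(theta, data)
    # "M-like" step: with negative samples held fixed, take a few quasi-Newton
    # (L-BFGS-B) steps on the surrogate F(data) - F(neg), whose gradient matches
    # the contrastive-divergence gradient at the current parameters.
    surrogate = lambda th: free_energy(th, data) - free_energy(th, neg)
    res = minimize(surrogate, theta, method="L-BFGS-B", options={"maxiter": 3})
    theta = res.x
    print(f"iter {it:2d}  surrogate objective {res.fun:+.4f}")

Holding the negative samples fixed makes the surrogate smooth in the parameters, so a few quasi-Newton steps can be taken before resampling; this mirrors the EM-like alternation the abstract refers to, at the cost of the surrogate drifting from the true log-likelihood gradient as the parameters move.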
Four decades after their invention, quasi-Newton methods are still state of the art in unconstrained...
Like sentinels guarding a secret treasure, computationally difficult problems define the edge of wha...
We propose a fast second-order method that can be used as a drop-in replacement for current deep lea...
The restricted Boltzmann machine (RBM) has been used as building blocks for many successful deep lea...
While first-order methods are popular for solving optimization problems that arise in large-scale de...
The restricted Boltzmann machine (RBM) is one of the widely used basic models in the field of deep l...
In this dissertation, we are concerned with the advancement of optimization algorithms for training ...
Abstract. Restricted Boltzmann Machines (RBM) are energy-based models that are successfully used as ...
Abstract: The quasi-Newton training method is the most effective method for feed-forward neural netw...
Incorporating curvature information in stochastic methods has been a challenging task. This paper pr...
We extend the well-known BFGS quasi-Newton method and its memory-limited variant LBFGS to the optimi...
In this paper, we present a new class of quasi-Newton methods for the effective learning in large mu...
Restricted Boltzmann machine (RBM) plays an important role in current deep learning techniques, as m...
This paper presents a novel quasi-Newton method for the minimization of the error function of a feed-...
The RBM is a stochastic, energy-based model of an unsupervised neural network. The RBM is a key pre...
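Several of the snippets above describe the RBM as a stochastic energy-based model. For reference (standard textbook notation, not taken from any one of the listed papers), a binary RBM with visible units v, hidden units h, weight matrix W, and biases b, c is defined by

E(\mathbf{v},\mathbf{h}) = -\mathbf{b}^\top\mathbf{v} - \mathbf{c}^\top\mathbf{h} - \mathbf{v}^\top W\mathbf{h}, \qquad p(\mathbf{v},\mathbf{h}) = \tfrac{1}{Z}\exp\!\big(-E(\mathbf{v},\mathbf{h})\big),
p(h_j = 1 \mid \mathbf{v}) = \sigma\!\Big(c_j + \textstyle\sum_i W_{ij} v_i\Big), \qquad p(v_i = 1 \mid \mathbf{h}) = \sigma\!\Big(b_i + \textstyle\sum_j W_{ij} h_j\Big),

where \sigma is the logistic function and Z is the partition function, whose intractability is what makes RBM training, and in particular the curvature information needed by second-order and quasi-Newton methods, difficult to compute exactly.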