We propose a novel approach to ranking Deep Learning (DL) hyper-parameters through the application of Sensitivity Analysis (SA). DL hyper-parameters play an important role in model accuracy; however, choosing optimal values for each parameter can be time- and resource-intensive. A better understanding of the importance of each parameter in relation to data and model architecture would benefit DL research. SA provides a quantitative measure by which hyper-parameters can be ranked in terms of their contribution to model accuracy, allowing comparisons to be made across datasets and architectures. The results showed the importance of optimal architecture, with the activation function being highly influential. The influence of learning rate decay was ranked ...
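To make the idea concrete, the following is a minimal sketch of how a variance-based (Sobol) sensitivity analysis could be run over a hyper-parameter space. The use of the SALib library, the particular hyper-parameters and bounds, and the train_and_evaluate placeholder (which stands in for actually training and scoring a DL model) are illustrative assumptions, not necessarily the setup used in the paper.

```python
# Minimal sketch: ranking hyper-parameters by their contribution to model
# accuracy using variance-based (Sobol) sensitivity analysis via SALib.
# NOTE: the hyper-parameter set, bounds, and the train_and_evaluate stub
# are illustrative assumptions, not the paper's exact configuration.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hyper-parameter space expressed as continuous proxies
# (e.g. log10 of the learning rate, log2 of the batch size).
problem = {
    "num_vars": 3,
    "names": ["log10_learning_rate", "dropout_rate", "log2_batch_size"],
    "bounds": [[-5.0, -1.0], [0.0, 0.7], [4.0, 9.0]],
}

def train_and_evaluate(x):
    """Placeholder for training a model with hyper-parameters x and
    returning validation accuracy. A cheap synthetic response is used
    here so the sketch runs without an actual training loop."""
    lr, dropout, batch = x
    return float(np.exp(-(lr + 3.0) ** 2) * (1.0 - 0.5 * dropout) - 0.01 * batch)

# Saltelli sampling; one accuracy evaluation per sampled configuration.
X = saltelli.sample(problem, 256)
Y = np.array([train_and_evaluate(x) for x in X])

# Total-order Sobol indices give the ranking: larger ST => more influential.
Si = sobol.analyze(problem, Y)
for name, st in sorted(zip(problem["names"], Si["ST"]), key=lambda t: -t[1]):
    print(f"{name}: ST = {st:.3f}")
```

Ranking by the total-order index ST accounts for interactions between hyper-parameters as well as their individual effects, which is what allows the resulting rankings to be compared across datasets and architectures.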
Deep Neural Networks have advanced rapidly over the past several years. However, it still seems like...
The recent success of large and deep neural network models has motivated the training of even larger...
The impact of learning algorithm optimization by means of parameter tuning is studied. To do this, t...
Recent research has found that deep learning architectures show significant improvements over tradit...
Tackling new machine learning problems with neural networks always means optim...
Classical sensitivity analysis of machine learning regression models is a topic sparsely covered in the literature...
Deep neural networks (DNNs) have achieved superior performance in various prediction tasks, but can ...
The behaviors of deep neural networks (DNNs) are notoriously resistant to human interpretations. In ...
Deep learning is proving to be a useful tool in solving problems from various domains. Despite a ric...
Hyperparameters in machine learning (ML) have received a fair amount of attention, and hyperparamete...
In a deep learning model, performance may vary depending on the setting of the hyperpara...
Thesis (Master's), University of Washington, 2021. Carefully crafted input has been shown to cause mis...
Explainable artificial intelligence (XAI) has shed light on enormous applications by clarifying why ...
Although recent works have brought some insights into the performance improvement of techniques used...