In this chapter, a comprehensive methodology is presented to address important data-driven challenges in classification. First, it is demonstrated that challenges such as heterogeneity and noise, commonly observed in large data sets, degrade the performance of deep neural network (DNN)-based classifiers. To obviate these issues, a two-step classification framework is introduced in which unwanted attributes (variables) are systematically removed in a preprocessing step and a DNN-based classifier then addresses heterogeneity in the learning process. Specifically, a multi-stage nonlinear dimensionality reduction (NDR) approach is described in this chapter to remove unwanted variables, and a novel optimization framework ...
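The two-step framework above can be sketched as follows. This is a minimal, hedged illustration, not the chapter's actual method: a simple variance filter stands in for the multi-stage NDR preprocessing (dropping near-constant "unwanted" attributes), and a single-neuron logistic model stands in for the DNN classifier. All names and data here are hypothetical, and only the Python standard library is used.

```python
# Step 1 stand-in: drop uninformative columns (here, by variance threshold).
# Step 2 stand-in: train a tiny classifier on the reduced feature set.
import math
import random

def variance_filter(X, threshold=1e-3):
    """Return indices of columns whose variance exceeds `threshold`."""
    n = len(X)
    keep = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        mu = sum(col) / n
        var = sum((v - mu) ** 2 for v in col) / n
        if var > threshold:
            keep.append(j)
    return keep

def train_logistic(X, y, lr=0.5, epochs=200):
    """Single-neuron 'network' trained by gradient descent on log loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            g = p - yi                        # gradient of log loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, xi):
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0

# Toy data: column 0 is informative; column 1 is a constant "unwanted" attribute.
random.seed(0)
X = [[random.gauss(1.0 if i % 2 else -1.0, 0.3), 5.0] for i in range(40)]
y = [i % 2 for i in range(40)]

keep = variance_filter(X)                       # preprocessing step
Xr = [[row[j] for j in keep] for row in X]      # reduced data set
w, b = train_logistic(Xr, y)                    # classification step
acc = sum(predict(w, b, xi) == yi for xi, yi in zip(Xr, y)) / len(y)
print(keep, acc)
```

The design point the sketch preserves is the chapter's separation of concerns: the classifier never sees the unwanted attributes, because they are removed before learning begins.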