While neural network binary classifiers are often evaluated on metrics such as Accuracy and $F_1$-Score, they are commonly trained with a cross-entropy objective. How can this training-evaluation gap be addressed? While specific techniques have been adopted to optimize certain confusion-matrix-based metrics, it is challenging or impossible in some cases to generalize these techniques to other metrics. Adversarial learning approaches have also been proposed to optimize networks via confusion-matrix-based metrics, but they tend to be much slower than common training methods. In this work, we propose a unifying approach to training neural network binary classifiers that combines a differentiable approximation of the Heaviside function with a pro...
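The idea of replacing the non-differentiable Heaviside step with a smooth surrogate, so that confusion-matrix entries (and hence metrics like $F_1$) become differentiable training objectives, can be sketched as follows. This is a minimal illustration under assumptions not taken from the abstract: it uses a sigmoid-with-temperature as the Heaviside approximation, and the function names `soft_heaviside` and `soft_f1_loss` are hypothetical, not the paper's actual formulation.

```python
import math

def soft_heaviside(z, tau=0.1):
    # Sigmoid with temperature tau approximates the Heaviside step H(z):
    # as tau -> 0, the output approaches 1 for z > 0 and 0 for z < 0,
    # while remaining differentiable everywhere.
    return 1.0 / (1.0 + math.exp(-z / tau))

def soft_f1_loss(scores, labels, tau=0.1):
    # Soft confusion-matrix entries: each example contributes fractionally
    # to TP / FP / FN according to the soft Heaviside of its score.
    tp = sum(soft_heaviside(s, tau) for s, y in zip(scores, labels) if y == 1)
    fp = sum(soft_heaviside(s, tau) for s, y in zip(scores, labels) if y == 0)
    fn = sum(1.0 - soft_heaviside(s, tau) for s, y in zip(scores, labels) if y == 1)
    # Soft F1 assembled from the soft counts; minimize (1 - F1).
    f1 = 2.0 * tp / (2.0 * tp + fp + fn + 1e-8)
    return 1.0 - f1
```

Because every operation above is smooth, the same construction extends to any metric expressible from confusion-matrix counts, which is the generality the abstract contrasts with metric-specific techniques.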
Though deep learning has been applied successfully in many scenarios, malicious inputs with human-im...
Deep neural networks have achieved great success on many tasks and even surpass human performance in...
Thesis (Ph.D.)--University of Washington, 2020. The recent renaissance of deep neural networks has lea...
This paper presents an ensemble neural network and interval neutrosophic sets approach to the proble...
We propose an algorithm improvement for classifying machine learning algorithms with the fuzzificat...
How to train a binary neural network (BinaryNet) with both high compression rate and high accuracy o...
The state-of-the-art machine learning approach to training deep neural networks, backpropagation, is...
Sample complexity results from computational learning theory, when applied to neural network learnin...
Neural network binarization accelerates deep models by quantizing their weights and activations into...
We present a method to train self-binarizing neural networks, that is, networks that evolve their we...
Deep neural networks (DNNs) are notorious for making more mistakes for the classes that have substan...
Different performance measures are used to assess the behaviour, and to carry out the comparison, of...
This paper presents a new approach to the problem of multiclass classification. The proposed approac...
The vast majority of statistical theory on binary classification characterizes performance in terms ...
Given a learning problem with real-world tradeoffs, which cost function should the model be trained ...