It is well established that deep networks are effective at extracting features from a given labeled (source) dataset. However, they do not always generalize well to other (target) datasets, which very often follow a different underlying distribution. In this report, we evaluate four domain adaptation techniques for image classification: Deep CORAL, Deep Domain Confusion (DDC), CDAN, and CDAN+E. These techniques are unsupervised in that the target dataset carries no labels during the training phase. We evaluate model performance on the Office-31 dataset. The GitHub repository accompanying this report is available at https://github.com/agrija9/Deep-Unsupervised-Domain-Adaptation
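To illustrate the flavor of these methods, consider Deep CORAL: it aligns the second-order statistics (covariances) of source and target features learned by a shared backbone. The sketch below is a minimal, assumption-laden PyTorch implementation of the CORAL loss only; it assumes batched feature tensors of shape (batch, d) and is not taken from the linked repository.

```python
import torch

def coral_loss(source, target):
    """CORAL loss: squared Frobenius distance between the feature
    covariance matrices of the source and target batches,
    scaled by 1 / (4 d^2) as in the Deep CORAL formulation."""
    d = source.size(1)

    # Covariance of the source feature batch
    source = source - source.mean(dim=0, keepdim=True)
    cov_s = source.t() @ source / (source.size(0) - 1)

    # Covariance of the target feature batch
    target = target - target.mean(dim=0, keepdim=True)
    cov_t = target.t() @ target / (target.size(0) - 1)

    return ((cov_s - cov_t) ** 2).sum() / (4 * d * d)
```

In training, this term is typically added to the source-domain classification loss with a weighting factor (lambda), so the network learns features that are both discriminative on the source and statistically aligned with the unlabeled target.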