Backdoor attacks mislead machine-learning models into outputting an attacker-specified class when presented with a specific trigger at test time. These attacks require poisoning the training data to compromise the learning algorithm, e.g., by injecting poisoning samples containing the trigger into the training set, along with the desired class label. Despite the increasing number of studies on backdoor attacks and defenses, the underlying factors affecting the success of backdoor attacks, along with their impact on the learning algorithm, are not yet well understood. In this work, we aim to shed light on this issue by showing that backdoor attacks induce a smoother decision function around the triggered samples, a phenomenon which we refer to as backdoor smoothing.
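To make the poisoning step concrete, the sketch below stamps a small patch trigger onto a fraction of the training images and relabels them with the attacker-specified target class, in the style of a BadNets attack. The patch shape, poisoning rate, and helper names are illustrative assumptions, not the specific setup studied here.

    # Minimal sketch of trigger-based data poisoning (illustrative assumptions:
    # square patch trigger, 5% poisoning rate, images as (n, H, W) arrays in [0, 1]).
    import numpy as np

    def add_trigger(image, patch_size=3, value=1.0):
        """Stamp a small square trigger patch in the bottom-right corner."""
        poisoned = image.copy()
        poisoned[-patch_size:, -patch_size:] = value
        return poisoned

    def poison_dataset(X, y, target_class, poison_rate=0.05, seed=0):
        """Replace a fraction of training samples with triggered copies
        relabelled to the attacker-specified target class."""
        rng = np.random.default_rng(seed)
        X_p, y_p = X.copy(), y.copy()
        idx = rng.choice(len(X), size=int(poison_rate * len(X)), replace=False)
        for i in idx:
            X_p[i] = add_trigger(X_p[i])
            y_p[i] = target_class
        return X_p, y_p

    # Usage: X_train has shape (n, H, W), y_train holds integer labels.
    # X_poisoned, y_poisoned = poison_dataset(X_train, y_train, target_class=0)

A model trained on the poisoned set learns to associate the patch with the target class, so any test input carrying the trigger is misclassified accordingly while clean inputs remain largely unaffected.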
A Backdoor attack (BA) is an important type of adversarial attack against deep neural network classi...
The backdoor or Trojan attack is a severe threat to deep neural networks (DNNs). Researchers find th...
Backdoor attacks threaten Deep Neural Networks (DNNs). Towards stealthiness, researchers propose cle...
With the success of deep learning algorithms in various domains, studying adversarial attacks to sec...
Deep neural networks (DNNs) are widely deployed today, from image classification to voice recognitio...
Deep learning has made tremendous success in the past decade. As a result, it is becoming widely dep...
Backdoor attacks are rapidly emerging threats to deep neural networks (DNNs). In the backdoor attack...
The data poisoning attack has raised serious security concerns on the safety of deep neural networks...
Backdoor attack is a type of serious security threat to deep learning models. An adversary can provid...
Backdoor attacks against CNNs represent a new threat against deep learning systems, due to the possi...
Backdoors are powerful attacks against deep neural networks (DNNs). By poisoning training data, atta...
Outsourced training and machine learning as a service have resulted in novel attack vectors like bac...
The growing dependence on machine learning in real-world applications emphasizes the importance of u...