Can quantum computers be used to implement machine learning models that outperform traditional methods, and are such methods suitable for today’s noisy quantum hardware? In this thesis, we developed a Python framework for implementing machine learning models based on parameterized quantum circuits that are evaluated on quantum hardware. The framework can implement quantum neural networks (QNNs) and quantum circuit networks (QCNs) and train them using gradient-based methods. To calculate the gradient of quantum circuit networks, we developed a backpropagation algorithm based on the parameter shift rule that leverages both classical and quantum hardware. We performed a numerical study where we sought to characterize how dense ...
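As a rough illustration of the kind of gradient evaluation the abstract refers to, the sketch below shows the standard parameter shift rule for a single Pauli-generated rotation gate, applied to a toy one-qubit circuit simulated with NumPy. The circuit, observable, and function names are illustrative assumptions and are not taken from the thesis framework itself.

```python
# A minimal, self-contained sketch of the parameter-shift rule for one qubit.
# The circuit (a single RY rotation on |0>) and the observable (Pauli Z) are
# illustrative choices, not the thesis framework's actual circuits.
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

Z = np.diag([1.0, -1.0])        # Pauli-Z observable
ket0 = np.array([1.0, 0.0])     # |0> initial state

def expectation(theta):
    """<psi(theta)| Z |psi(theta)> with |psi(theta)> = RY(theta)|0>."""
    psi = ry(theta) @ ket0
    return float(psi @ Z @ psi)

def parameter_shift_grad(theta, shift=np.pi / 2):
    """Exact gradient of the expectation value via the parameter-shift rule."""
    return 0.5 * (expectation(theta + shift) - expectation(theta - shift))

theta = 0.3
print(parameter_shift_grad(theta))  # matches the analytic derivative -sin(theta)
print(-np.sin(theta))
```

Here the expectation value is cos(theta), so the shifted-circuit difference reproduces the analytic derivative -sin(theta) exactly; on real hardware the same two circuit evaluations would be estimated from measurement shots rather than computed from the state vector.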
Quantum Computing leverages the quantum properties of subatomic matter to enable computations faster...
The application of near-term quantum devices to machine learning (ML) has attracted much attention. ...
The resurgence of self-supervised learning, whereby a deep learning model generates its own supervis...
Quantum machine learning has proven to be a fruitful area in which to search for potential applicati...
The goal of generative machine learning is to model the probability distribution underlying a given ...
Machine learning algorithms based on parametrized quantum circuits are prime candidates for near-ter...
Large fault-tolerant universal gate quantum computers will provide a major speedup to a variety of ...
Artificial neural networks are at the heart of modern deep learning algorithms. We describe how to e...
Quantum computing represents a promising paradigm for solving complex problems, such as large-number...
Machine learning (ML) has revolutionized the world in recent years. Despite the success, the huge co...
Quantum neural networks hold significant promise for numerous applications, particularly as they can...
Mainstream machine-learning techniques such as deep learning and probabilistic programming rely heav...
Near-term quantum devices can be used to build quantum machine learning models, such as quantum kern...