Equalization and channel decoding are “traditionally” two cascaded processes at the receiver side of a digital transmission system. They aim at reliable and efficient transmission. For high data rates, the energy consumption of the corresponding algorithms is expected to become a limiting factor. For mobile devices with a limited battery size, energy consumption, mirrored in the lifetime of the battery, becomes even more crucial. Therefore, an energy-efficient implementation of equalization and decoding algorithms is desirable. The prevailing approach is to increase the energy efficiency of the underlying digital circuits. Here, however, we address promising alternatives offered by mixed (analog/digital) circuits. We are concerned with ...
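As a point of reference for the traditional cascade described above, the following minimal sketch illustrates a linear equalizer adapted with the standard LMS rule, whose hard decisions would then be handed to a separate channel decoder. It is a generic digital-domain illustration, not the mixed analog/digital approach pursued here; the function name lms_equalizer, the real-valued BPSK symbol model, and all parameter values are assumptions made for the example.

```python
import numpy as np

def lms_equalizer(received, training, num_taps=11, mu=0.01):
    # Generic tap-delay-line equalizer with LMS adaptation (illustrative sketch).
    w = np.zeros(num_taps)                      # equalizer coefficients
    decisions = np.zeros(len(received))
    for n in range(num_taps, len(received)):
        x = received[n - num_taps:n][::-1]      # most recent samples first
        y = w @ x                               # equalizer output
        decisions[n] = np.sign(y)               # hard decision (BPSK assumed)
        if n < len(training):                   # training phase with known symbols
            e = training[n] - y                 # error w.r.t. the desired symbol
            w += mu * e * x                     # LMS coefficient update
    # In the cascaded receiver, these decisions (or soft values) would next be
    # passed to the channel decoder, e.g. a Viterbi or LDPC decoder.
    return decisions, w
```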
For decades, the semiconductor industry enjoyed exponential improvements in microprocessor power...
In this work, we demonstrate the offline FPGA realization of both recurrent and feedforward neural n...
Activation functions represent an essential element in all neural network structures. They influenc...
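For concreteness, a few activation functions commonly encountered in such structures are sketched below in software form; the selection is an illustrative assumption and says nothing about the hardware realizations discussed in the cited work.

```python
import numpy as np

# Common activation functions (illustrative selection).
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))   # smooth, saturating, output in (0, 1)

def tanh(x):
    return np.tanh(x)                 # smooth, saturating, output in (-1, 1)

def relu(x):
    return np.maximum(0.0, x)         # piecewise linear, inexpensive to realize
```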
The human brain has the capability to organize the neurons (experience-adapted connections) to perfo...
Thesis (Ph. D.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program...
Neural networks are currently implemented on digital Von Neumann machines, which do not fully levera...
In this thesis we are concerned with the hardware implementation of learning algorithms for...
There is an urgent need for compact, fast, and power-efficient hardware implementations of state-of-...
In this chapter, we present an overview of the recent advances in analog-to-digital converter (ADC) ...
To circumvent the non-parallelizability of recurrent neural network-based equalizers, we propose kno...
Analog neural networks with feedback can be used to implement k-Winner-Take-All (KWTA) networks. In ...
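Since the snippet above is truncated, the sketch below is only a generic discrete-time emulation of k-winner-take-all selection through a shared inhibitory feedback signal, not the analog circuit discussed in that work; all names, gains, and step sizes are assumptions.

```python
import numpy as np

def kwta(x, k, gain=20.0, lr=0.05, num_iter=500):
    # Each unit compares its input x[i] against a shared inhibition level t,
    # which is fed back until roughly k units remain active (illustrative only).
    t = float(np.median(x))                             # shared inhibition level
    for _ in range(num_iter):
        active = 1.0 / (1.0 + np.exp(-gain * (x - t)))  # soft unit outputs
        t += lr * (active.sum() - k)                    # feedback drives count toward k
    return (x > t).astype(int)                          # winners after settling

# Example: the three largest entries win.
print(kwta(np.array([0.1, 0.9, 0.3, 0.8, 0.7, 0.2]), k=3))  # -> [0 1 0 1 1 0]
```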
In this paper, original improvements to recurrent analog neural networks, which are based on Kalman f...
Research presented in this thesis provides a substantial leap from the study of interesting devi...
In this paper a new approach to the equalization of digital transmission channels is introduced and ...
We present experimental results on supervised learning of dynamical features in an analog VLSI neura...