Entropy is one of the most fundamental notions for understanding complexity. Among the many methods for calculating entropy, sample entropy (SampEn) is a practical and widely used estimator of time-series complexity. Unfortunately, SampEn is computationally expensive, with a cost that grows quadratically with the number of elements, which makes it unviable for processing large data series. In this work, we evaluate hardware SampEn architectures to offload this computational load, using improved SampEn algorithms and exploiting field-programmable gate arrays (FPGAs), a reconfigurable technology well known for its high performance and power efficiency. In addition to the straightforward Samp...
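As a point of reference for the quadratic cost mentioned above, the following is a minimal Python sketch of the straightforward SampEn computation, assuming the standard Richman–Moorman definition with Chebyshev distance and the common tolerance r = 0.2·σ. The function name, defaults, and example signals are illustrative assumptions, not details of the hardware architectures evaluated in this work.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Straightforward O(N^2) Sample Entropy (Richman & Moorman definition)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)  # common tolerance choice (assumption, not from the paper)

    def count_matches(dim):
        # Use the first n - m windows for both lengths m and m + 1,
        # so both counts are taken over the same number of templates.
        templates = [x[i:i + dim] for i in range(n - m)]
        count = 0
        # Pairwise Chebyshev-distance comparison, excluding self-matches:
        # this nested loop is the quadratic bottleneck.
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if np.max(np.abs(templates[i] - templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)      # similar template pairs of length m
    a = count_matches(m + 1)  # similar template pairs of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

# Example: a noisy sine wave yields low SampEn; white noise scores higher.
rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 400)
print(sample_entropy(np.sin(t) + 0.1 * rng.standard_normal(t.size)))
print(sample_entropy(rng.standard_normal(400)))
```

The nested pairwise comparison over all templates is what drives the quadratic growth with series length that the hardware architectures aim to offload.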
Processing large volumes of information generally requires massive amounts of computational power, w...
In this work we describe a method to measure the computing performance and energy-efficiency to be e...
Approximate Entropy and especially Sample Entropy have recently become frequently used algorithms for calcul...
Recent literature has reported the use of entropy measurements for anomaly detection purposes in IP ...
Mutual Information (MI) and Transfer Entropy (TE) algorithms compute statistical meas...
Network traffic monitoring uses empirical entropy to detect anomalous events such as various types o... (see the empirical-entropy sketch after this listing of related work).
This paper concentrates on the entropy estimation of time series. Two new algorithms are introduced:...
Sample Entropy is the most popular definition of entropy and is widely used as a measure of the regu...
This article discusses the use of entropy calculation on Field Programmable Gate Array (FPGA) for id...
The magnitude of the information content associated with a particular implementation of a Physical U...
In line with Shannon's ideas, we define the entropy of FPGA reconfiguration to be the amount of info...
This paper investigates the effects of different design tool (Xilinx ISE) optimisatio...
With the advent of big data and cloud computing, there is tremendous interest in optimised algorithm...
FPGA devices used in the HPC context promise an increased energy efficiency, enhancing the computing...
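The empirical-entropy approach referenced in the network traffic monitoring snippet above can be illustrated with a minimal Python sketch: Shannon entropy is computed over the empirical distribution of a per-window packet feature. The choice of destination ports as the feature and the toy windows below are illustrative assumptions, not details taken from the cited works.

```python
import math
from collections import Counter

def empirical_entropy(values):
    """Shannon entropy (in bits) of the empirical distribution of `values`.

    For traffic monitoring, `values` would typically be a feature observed
    over a time window (e.g. source IPs or destination ports); abrupt
    changes in this entropy can flag anomalies such as scans or floods.
    """
    counts = Counter(values)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy example: destination ports seen in two monitoring windows.
normal_window = [80, 443, 80, 22, 443, 8080, 53, 80, 443, 25]
scan_window = list(range(1000, 1010))  # one host probing many distinct ports
print(empirical_entropy(normal_window))  # moderate entropy
print(empirical_entropy(scan_window))    # near-maximal entropy (suspicious)
```

In practice the same computation is repeated per time window and the resulting entropy series is monitored for abrupt deviations from its baseline.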