In most modern processor designs, the L1 data cache has become a major consumer of power due to its increasing size and high access frequency. To reduce this power consumption, we propose in this paper a straightforward filtering technique. The mechanism relies on a highly accurate forwarding predictor that determines whether a load instruction will obtain its data via forwarding from the load-store structure, thus avoiding the data cache access, or whether it must instead fetch the data from the data cache. Our simulation results show that 36% of the data cache power can be saved on average, with a negligible performance penalty of 0.1%.
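The abstract does not specify how the forwarding predictor is organized; as an illustrative assumption only, a PC-indexed table of 2-bit saturating counters is one common way to build such a predictor. The sketch below (all names and sizes are hypothetical, not taken from the paper) shows how a prediction of "will be forwarded" would gate the L1 data cache probe, and how the predictor is trained once the load resolves.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical PC-indexed forwarding predictor: a table of 2-bit
 * saturating counters. A load is predicted to receive its data by
 * store-to-load forwarding (and therefore skips the L1 D-cache probe)
 * when its counter is in the upper half of the range.               */
#define PRED_ENTRIES 1024                 /* assumed table size */

static uint8_t pred_table[PRED_ENTRIES];  /* counters in 0..3 */

static unsigned pred_index(uint64_t load_pc) {
    return (unsigned)((load_pc >> 2) & (PRED_ENTRIES - 1));
}

/* true  -> expect forwarding from the load-store structure,
 *          so do not access the data cache;
 * false -> access the data cache as usual.                          */
bool predict_forwarding(uint64_t load_pc) {
    return pred_table[pred_index(load_pc)] >= 2;
}

/* Train with the actual outcome once the load resolves. */
void train_predictor(uint64_t load_pc, bool was_forwarded) {
    uint8_t *ctr = &pred_table[pred_index(load_pc)];
    if (was_forwarded) {
        if (*ctr < 3) (*ctr)++;
    } else {
        if (*ctr > 0) (*ctr)--;
    }
}

/* Per-load flow: the prediction gates the D-cache access; a wrong
 * "forwarding" prediction is repaired by a replayed cache access.    */
void issue_load(uint64_t load_pc, bool actually_forwarded) {
    bool skip_dcache = predict_forwarding(load_pc);
    if (!skip_dcache) {
        /* normal path: probe the L1 D-cache (energy spent)            */
    } else if (!actually_forwarded) {
        /* misprediction: replay the access through the D-cache        */
    }
    train_predictor(load_pc, actually_forwarded);
}

int main(void) {
    /* Toy example: a load at PC 0x400 that always hits a prior store. */
    for (int i = 0; i < 4; i++)
        issue_load(0x400, true);
    printf("predict forwarding for 0x400: %d\n", predict_forwarding(0x400));
    return 0;
}
```

In this sketch, only mispredicted "forwarding" loads pay an extra replayed cache access, which is consistent with the abstract's claim that a highly accurate predictor keeps the performance penalty negligible while most filtered loads never touch the data cache.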