In this paper, a new cache replacement policy named Selection Alternative Replacement (SAR), which minimizes the shared cache miss rate in chip multiprocessor architectures, is proposed. A variety of cache replacement policies have been used to minimize cache misses. However, replacing cache items that have high utilization leads to additional cache misses. The SAR policy stores the labels of discarded cache items and uses the stored information to prevent additional cache misses. Experimental results show that the SAR policy decreases the cache miss rate by 6.01% on average and improves instructions per cycle by 7.01% on average compared with the conventional pseudo least recently used (pseudo-LRU) policy. This research was sponsored by the Seoul R&BD Program (1...
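The abstract describes the mechanism only at a high level: SAR remembers the labels of discarded cache items and uses that record to avoid throwing away still-useful lines again. The following is a minimal, hypothetical Python sketch of that general idea, not the paper's actual SAR algorithm; the class name, the history length, and the "protected" insertion flag are all invented for illustration. Each set keeps a short history of evicted tags, and a refill whose tag appears in that history is inserted as protected so the plain LRU victim choice is steered away from it.

# Hypothetical sketch (not the paper's SAR algorithm): a set that remembers the
# labels of recently discarded lines and re-inserts a re-requested one as
# "protected", steering the LRU victim choice away from it next time.
from collections import deque, OrderedDict

class SketchSARSet:
    def __init__(self, ways=8, history=8):
        self.ways = ways
        self.lines = OrderedDict()             # tag -> protected flag, ordered LRU-first
        self.evicted = deque(maxlen=history)   # labels of recently discarded lines

    def access(self, tag):
        """Return True on a hit, False on a miss."""
        if tag in self.lines:
            protected = self.lines.pop(tag)
            self.lines[tag] = protected        # move to MRU position
            return True
        # Miss: a tag found in the eviction history suggests the earlier
        # replacement was premature, so the refill is inserted protected.
        protected = tag in self.evicted
        if len(self.lines) >= self.ways:
            self._evict()
        self.lines[tag] = protected
        return False

    def _evict(self):
        # Prefer the LRU unprotected line; fall back to plain LRU if all are protected.
        victim = next((t for t, prot in self.lines.items() if not prot),
                      next(iter(self.lines)))
        del self.lines[victim]
        self.evicted.append(victim)            # remember the discarded label

if __name__ == "__main__":
    s = SketchSARSet(ways=2, history=4)
    hits = [s.access(t) for t in ["A", "B", "C", "A", "B", "C"]]
    print(hits)  # tiny thrashing demo; refills whose tags recur come back protected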
The increasing number of threads inside the cores of a multicore processor, and competitive access t...
We propose SCORE, a novel adaptive cache replacement policy, which uses a scor...
Despite extensive developments in improving cache hit rates, designing an optimal cache replacement ...
Poor cache memory management can have adverse impact on the overall system performance. In a Chip Mu...
Cache memory performance is an important factor in determining overall processor performance. In a m...
The performance loss resulting from different cache misses is variable in modern systems for two rea...
Modern microprocessors tend to use on-chip caches that are much smaller than the working set size of...
The increasing speed-gap between processor and memory and the limited memory bandwidth make last-lev...
Multicore processors have become ubiquitous, both in general-purpose and special-purpose application...
One of the dominant approaches towards implementing fast and high performance computer architectures...
By taking advantage of locality in memory accesses, caching can be defined as a fundamental app...
An optimal replacement policy that minimizes the miss rate in a private cache was proposed several d...
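The "optimal replacement policy proposed several decades ago" refers to the offline MIN/OPT rule usually attributed to Belady: evict the resident block whose next use lies furthest in the future. Below is a minimal Python sketch of that rule, assuming the full future reference string is known; the function name and the example trace are illustrative only.

# Minimal sketch of an offline optimal ("furthest next use") replacement policy
# in the style of Belady's MIN, assuming the entire reference string is known.
def min_misses(trace, capacity):
    cache, misses = set(), 0
    for i, block in enumerate(trace):
        if block in cache:
            continue
        misses += 1
        if len(cache) >= capacity:
            def next_use(b):
                # Position of the next reference to b, or infinity if never used again.
                try:
                    return trace.index(b, i + 1)
                except ValueError:
                    return float("inf")
            # Evict the resident block whose next use is furthest in the future.
            cache.remove(max(cache, key=next_use))
        cache.add(block)
    return misses

if __name__ == "__main__":
    print(min_misses(list("ABCABDAB"), capacity=2))  # 6 misses on this tiny trace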