The reuse distance (least recently used (LRU) stack distance) is an essential metric for performance prediction and optimization of storage caches. Over the past four decades, there have been steady improvements in the algorithmic efficiency of reuse distance measurement, and this progress has accelerated in recent years, both in theory and in practical implementation. In this article, we present a kinetic model of LRU cache memory, based on the average eviction time (AET) of the cached data. The AET model enables fast measurement and low-cost sampling. It can produce the miss ratio curve in linear time with extremely low space costs. On storage trace benchmarks, AET reduces the time and space costs compared to prior techniques. Furthermore...
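As a rough illustration of how such a model turns a reuse-time histogram into a miss ratio curve, the sketch below uses the AET relation from the full paper: P(t) is the fraction of accesses whose reuse time exceeds t, AET(c) is the point at which the accumulated integral of P(t) reaches the cache size c, and the miss ratio at size c is P(AET(c)). This is a minimal sketch, not the authors' released code; the function name, the assumption that the trace is a plain sequence of block IDs, and the discretization are all illustrative choices.

from collections import defaultdict

def aet_mrc(trace, max_cache_size):
    # Pass 1: reuse-time histogram. The reuse time of an access is the number
    # of accesses since the previous access to the same block; first-time
    # accesses have infinite reuse time and stay in the tail of P(t) forever.
    last_pos = {}
    reuse_hist = defaultdict(int)
    for i, blk in enumerate(trace):
        if blk in last_pos:
            reuse_hist[i - last_pos[blk]] += 1
        last_pos[blk] = i
    total = len(trace)

    # Pass 2: sweep t, accumulating the integral of P(t). Each time the
    # integral crosses a cache size c, t approximates AET(c) and the current
    # P(t) approximates the miss ratio at size c.
    mrc = {}
    integral = 0.0
    tail = total                      # accesses with reuse time > t
    c = 1
    for t in range(total + 1):
        p_t = tail / total            # P(t)
        while c <= max_cache_size and integral >= c:
            mrc[c] = p_t              # mr(c) = P(AET(c))
            c += 1
        integral += p_t
        tail -= reuse_hist.get(t + 1, 0)
    # Sizes the integral never reached hold the whole working set: only cold
    # misses remain, and `tail` has converged to the number of distinct blocks.
    for size in range(c, max_cache_size + 1):
        mrc[size] = tail / total
    return mrc

# Usage example: aet_mrc(["a", "b", "a", "c", "b", "a"], 4)
# yields miss ratios {1: 1.0, 2: 0.83, 3: 0.5, 4: 0.5}, matching exact LRU
# simulation of that short trace.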
Performance metrics and models are prerequisites for scientific understanding and optimization. This...
Improving cache performance requires understanding cache behavior. However, measuring cache performa...
Modern processors use high-performance cache replacement policies that outperform traditional altern...
Locality, characterized by data reuses, determines caching performance. Reuse distance (i.e., LRU st...
We develop a reuse distance/stack distance based analytical modeling framework for efficient, online...
The Miss Ratio Curve (MRC) is an important metric and effective tool for caching system performance ...
An accurate, tractable, analytic cache model for time-shared systems is presented, which estimates t...
A scalar metric for temporal locality is proposed. The metric is based on LRU stack distance. This p...
The cache Miss Ratio Curve (MRC) serves a variety of purposes such as cache partitioning, applicatio...
Cache is one of the most widely used components in today's computing systems. Its performance is hea...
In this work, we study systems with two levels of memory: a fixed-size cache, and a backing store, e...
Feedback-directed optimization has become an increasingly important tool in designing and building o...