Recent increases in CPU performance have outpaced increases in hard drive performance. As a result, disk operations have become more expensive in terms of CPU cycles spent waiting for disk operations to complete. File prediction can mitigate this problem by prefetching files into cache before they are accessed. However, incorrect prediction is to a certain degree both unavoidable and costly. We present the Program-based Last N Successors (PLNS) file prediction model that identifies relationships between files through the names of the programs accessing them. Our simulation results show that PLNS makes at least 21.11% fewer incorrect predictions and roughly the same number of correct predictions as the last-successor model. We also examine...
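The core idea of program-based last-successor prediction can be sketched briefly. This is a hypothetical illustration only, assuming the simplest case (N=1): the abstract states that PLNS keys successor history on the name of the accessing program, but the class and method names below are our own, not taken from the paper.

```python
class PLNSPredictor:
    """Minimal sketch of a program-based last-successor predictor (PLNS, N=1).

    Instead of one global "last file to follow X" table, successor history
    is kept per accessing program, so different programs touching the same
    file do not pollute each other's predictions.
    """

    def __init__(self):
        # (program, file) -> file most recently observed to follow it
        self.successor = {}
        # program -> last file that program accessed
        self.last_access = {}

    def record(self, program, filename):
        """Observe that `program` accessed `filename`."""
        prev = self.last_access.get(program)
        if prev is not None:
            self.successor[(program, prev)] = filename
        self.last_access[program] = filename

    def predict(self, program, filename):
        """Predict the next file `program` will access after `filename`,
        or None if no per-program history exists yet."""
        return self.successor.get((program, filename))
```

For example, after observing `gcc` open `a.c` and then `a.h`, `predict("gcc", "a.c")` returns `a.h`, while `predict("ld", "a.c")` returns `None` because `ld` has no recorded history for that file.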
Traditional caches employ the LRU management policy to drive replacement decisions. However, previou...
When picking a cache replacement policy for file systems, LRU (Least Recently Used) has always been ...
Modern processors use high-performance cache replacement policies that outperform traditional altern...
Prefetching multiple files per prediction can improve the predictive accuracy. However, it comes wit...
We have adapted a multi-order context modeling technique used in the data compression method Predict...
This paper describes the design, implementation, and evaluation of a predictive file caching approac...
Despite impressive advances in file system throughput resulting from technologies such as high-bandw...
Neural networks have been widely applied to various research and production fields. However, most re...
File prefetching based on previous file access patterns has been shown to be an effective means of r...
Memory latency is a key bottleneck for many programs. Caching and prefetching are two popular hardwa...