In a disk I/O-intensive online server, sequential data accesses of one application instance can be frequently interrupted by other concurrent processes. Although aggressive I/O prefetching can improve the granularity of sequential data access, it must control the I/O bandwidth wasted on prefetching unneeded data. In this paper, we propose a competitive prefetching strategy that balances the overhead of disk I/O switching and that of unnecessary prefetching. Based on a simple model, we show that the performance of our strategy (in terms of I/O throughput) is at least half that of the optimal offline policy. We have implemented competitive prefetching in the Linux 2.6.3 kernel and conducted experiments based on microbenchmarks and two real ap...
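The balance this abstract describes can be sketched concretely: prefetch just enough data that the time spent transferring it matches the disk's I/O-switch (seek plus rotation) overhead, so the time wasted on any unneeded prefetch never exceeds the switch time it saved, which is the intuition behind the factor-of-two bound. This is a minimal illustration under assumed parameters, not the paper's implementation; the function name and the example numbers (5 ms switch overhead, 80 MB/s sequential bandwidth) are illustrative.

```python
def competitive_prefetch_size(switch_overhead_s: float,
                              transfer_rate_bps: float) -> int:
    """Prefetch size (bytes) at which sequential transfer time
    equals the I/O-switch overhead, balancing the two costs."""
    return int(switch_overhead_s * transfer_rate_bps)

# Illustrative parameters: 5 ms switch cost, 80 MB/s bandwidth.
size = competitive_prefetch_size(0.005, 80 * 1024 * 1024)
print(size)  # ~410 KB per prefetch under these assumptions
```

With these numbers the strategy prefetches a few hundred kilobytes per disk visit; a faster disk or costlier switch both push the prefetch size up, exactly the trade-off the abstract names.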
This paper presents cooperative prefetching and caching — the use of network-wide global resources (...
Optimizing collective input/output (I/O) is important for improving throughput of parallel scientific ...
Abstract: We study integrated prefetching and caching in single and parallel disk systems. In the firs...
Recent studies on operating system support for highly concurrent online servers mostly target CPU-in...
Thesis (Ph. D.)--University of Rochester. Dept. of Computer Science, 2008. Recent studies on operatin...
In this paper, we examine the way in which prefetching can exploit parallelism. Prefetching has been st...
In this thesis we study prefetching and buffer management algorithms for parallel I/O systems. Two m...
In multi-core systems, an application's prefetcher can interfere with the memo...
Abstract: We address the problem of prefetching and caching in a parallel I/O system and present a ne...
We provide a competitive analysis framework for online prefetching and buffer management algorithms ...
High-performance I/O systems depend on prefetching and caching in order to deliver good performance ...
We provide a competitive analysis framework for online prefetching and buffer management algorithms ...
Parallel applications can benefit greatly from massive computational capability, but their performan...
Multiple memory models have been proposed to capture the effects of memory hierarchy culminating in ...
Abstract: In this paper, we present an informed prefetching technique called IPODS that makes use of ...