Prefetching has been shown to be an effective technique for reducing user perceived latency in distributed systems. In this paper we show that even when prefetching adds no extra traffic to the network, it can have serious negative performance effects. Straightforward approaches to prefetching increase the burstiness of individual sources, leading to increased average queue sizes in network switches. However, we also show that applications can avoid the undesirable queueing effects of prefetching. In fact, we show that applications employing prefetching can significantly improve network performance, to a level much better than that obtained without any prefetching at all. This is because prefetching offers increased opportunities for traffi...
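The queueing effect described above can be illustrated with a toy discrete-time simulation (a hypothetical sketch, not code from the cited paper): a switch drains one packet per time slot, a "bursty" client issues all k prefetches in a single slot, and a "paced" client spreads the same load over k slots. Even at identical total load, the bursty source leaves a larger average backlog in the queue.

```python
# Toy discrete-time queue simulation (hypothetical illustration).
# A switch serves `service_rate` packets per slot; `arrivals[t]` is
# how many packets arrive in slot t. Returns the average backlog.
def avg_queue_length(arrivals, service_rate=1):
    queue = 0
    total = 0
    for a in arrivals:
        queue += a                          # packets arrive
        queue = max(queue - service_rate, 0)  # switch drains the queue
        total += queue                      # record remaining backlog
    return total / len(arrivals)

slots, k = 100, 5
# Same total load (100 packets in 100 slots), different shapes:
bursty = [k if t % k == 0 else 0 for t in range(slots)]  # k at once
paced = [1] * slots                                      # one per slot

print(avg_queue_length(bursty))  # → 2.0 (persistent backlog)
print(avg_queue_length(paced))   # → 0.0 (queue always drains)
```

The paced schedule here is a stand-in for the traffic-shaping opportunity the abstract alludes to: because prefetches are speculative, the application is free to spread them over idle periods rather than issuing them back-to-back.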
Prefetching is a potential method to reduce waiting time for retrieving data over wireless network c...
In this paper, we examine the way in which prefetching can exploit parallelism. Prefetching has been st...
Developing efficient distributed applications while ...
We investigate speculative prefetching under a model in which prefetching is neither aborted nor pre...
Although Internet service providers and communications companies are continuously offering higher an...
Prefetching is a basic mechanism in the World Wide Web that speculates on the ...
This thesis considers two approaches to the design of high-performance computers. In a single pro...
Previous studies in speculative prefetching focus on building and evaluating access models for the p...
The long-term success of the World Wide Web depends on fast response time. People use the Web to acc...
Prefetching is a common method to prevent memory stalls, but it depends on the time sensitive return...
We study the performance of multiuser document prefetching in a two-tier heterogeneous wireless syst...
In multi-core systems, an application's prefetcher can interfere with the memo...
Data prefetching is an effective technique for hiding memory latency. When issued prefetches are inac...
Prefetching and caching are techniques commonly used in I/O systems to reduce latency. Many research...