Networks form an indispensable part of our lives. In particular, computer networks have ranked amongst the most influential networks in recent times. In such ever-evolving and fast-growing networks, the primary concern is to understand and analyse different aspects of network behaviour, such as quality of service and efficient information propagation. It is also desirable to predict the behaviour of a large computer network if, for example, one of its computers is infected by a virus. In all of these cases, we need protocols that can make local decisions and handle dynamic changes in the network topology. Here, randomised algorithms are preferred because deterministic algorithms often require a centra...
We study threshold-based load balancing protocols for weighted tasks. We are given an arbitrary grap...
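The abstract above is truncated before the protocol is specified, so the following is only a plausible sketch of a threshold-based dynamic, not the thesis's actual protocol: each vertex holds a multiset of weighted tasks and hands one task to a random neighbour whenever the load difference exceeds the threshold. The function name `threshold_balance` and all parameters are illustrative assumptions.

```python
import random

def threshold_balance(adj, tasks, threshold, rounds, seed=0):
    """One plausible threshold-based balancing dynamic (an assumption,
    not the protocol from the abstract above).

    adj:       dict mapping vertex -> list of neighbours.
    tasks:     dict mapping vertex -> list of task weights.
    threshold: a vertex sends a task only if its load exceeds the
               chosen neighbour's load by more than this amount.
    Returns the final load (sum of task weights) at each vertex.
    """
    rng = random.Random(seed)
    tasks = {v: list(ts) for v, ts in tasks.items()}
    for _ in range(rounds):
        for v in list(tasks):
            u = rng.choice(adj[v])  # probe one random neighbour
            load_v, load_u = sum(tasks[v]), sum(tasks[u])
            if tasks[v] and load_v - load_u > threshold:
                # Migrate one task from the overloaded endpoint.
                tasks[u].append(tasks[v].pop())
    return {v: sum(ts) for v, ts in tasks.items()}
```

On two vertices with loads 10 and 0 and threshold 2, this dynamic settles once the gap is within the threshold, illustrating why such protocols balance only up to an additive term related to the threshold and the maximum task weight.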
We consider broadcasting in random d-regular graphs by using a simple modification of the random pho...
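The modification studied in the abstract above is truncated away, but the baseline it modifies, the push protocol in the random phone call model, can be sketched as follows: in each synchronous round, every informed vertex forwards the rumour to one neighbour chosen uniformly at random.

```python
import random

def push_broadcast(adj, source, seed=0):
    """Round-based push broadcasting in the random phone call model.

    adj:    dict mapping vertex -> list of neighbours.
    source: the vertex that initially knows the rumour.
    Returns the number of rounds until every vertex is informed.
    """
    rng = random.Random(seed)
    informed = {source}
    rounds = 0
    while len(informed) < len(adj):
        # Every informed vertex calls one uniformly random neighbour.
        newly = {rng.choice(adj[v]) for v in informed}
        informed |= newly
        rounds += 1
    return rounds
```

Since the informed set can at most double per round, broadcasting takes at least log2(n) rounds on any n-vertex graph, which is the benchmark such modifications are measured against.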
The Moran process, as studied by Lieberman, Hauert and Nowak, is a randomised algorithm modelling th...
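A minimal simulation of the Moran process as it is commonly defined on graphs: an individual is chosen to reproduce with probability proportional to fitness, and its offspring replaces a uniformly random neighbour. Details such as the initial mutant placement and the mutant fitness `r` are standard assumptions, not taken from the truncated abstract.

```python
import random

def moran_process(adj, r=1.5, seed=0):
    """Simulate the Moran process on a graph until absorption.

    adj: dict mapping vertex -> list of neighbours.
    r:   relative fitness of mutants (residents have fitness 1).
    Returns True if the mutation fixates, False if it goes extinct.
    """
    rng = random.Random(seed)
    vertices = list(adj)
    # A single mutant placed uniformly at random, as in the standard model.
    mutant = {rng.choice(vertices)}
    while 0 < len(mutant) < len(vertices):
        # Select a reproducer with probability proportional to fitness.
        weights = [r if v in mutant else 1.0 for v in vertices]
        v = rng.choices(vertices, weights=weights, k=1)[0]
        # The offspring replaces a uniformly random neighbour.
        u = rng.choice(adj[v])
        if v in mutant:
            mutant.add(u)
        else:
            mutant.discard(u)
    return len(mutant) == len(vertices)
```

Running many independent trials on a fixed graph estimates the fixation probability, the central quantity in this line of work.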
Examples of large-scale networks include the Internet, peer-to-peer networks, parallel computing sys...
This thesis studies a few randomized algorithms in application-layer peer-to-peer networks. The sign...
© 2006 Dr. Julie Anne Cain. Random graph processes are most often used to investigate theoretical ques...
Randomness is a crucial component in the design and analysis of many efficient algorithms. This thes...
This thesis studies random walks and their algorithmic applications in distributed networks. Random wa...
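The primitive underlying such applications is the simple random walk: at each step, move to a uniformly random neighbour. The sketch below shows the walk itself and its cover time (steps until every vertex has been visited); this is a generic illustration, not an algorithm from the thesis.

```python
import random

def random_walk(adj, start, steps, seed=0):
    """Simple random walk: move to a uniformly random neighbour
    at each step.  Returns the sequence of visited vertices."""
    rng = random.Random(seed)
    path = [start]
    v = start
    for _ in range(steps):
        v = rng.choice(adj[v])
        path.append(v)
    return path

def cover_time(adj, start, seed=0):
    """Number of steps until the walk has visited every vertex."""
    rng = random.Random(seed)
    seen = {start}
    v, steps = start, 0
    while len(seen) < len(adj):
        v = rng.choice(adj[v])
        seen.add(v)
        steps += 1
    return steps
```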
This work studies the generalized Moran process, as introduced by Lieberman et al. (2005) [20]. We i...