The analysis of several algorithms and data structures can be framed as a peeling process on a random hypergraph: vertices with degree less than k are repeatedly removed until no vertex of degree less than k remains. The remaining hypergraph is known as the k-core. In this paper, we analyze parallel peeling processes, where in each round, all vertices of degree less than k are removed simultaneously. It is known that, below a specific edge density threshold, the k-core is empty with high probability. We show that, below this threshold, with high probability only (1/log((k−1)(r−1))) log log n + O(1) rounds of peeling are needed to obtain the empty k-core for r-uniform hypergraphs. Interestingly, we show that above this threshold, Ω(log n) rounds of peeling are required.
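The round-based parallel peeling process described in this abstract can be sketched directly: in each round, every vertex of degree less than k is removed at once, together with all hyperedges containing it, and the rounds are counted. The following is a minimal simulation sketch (the function name `peel_rounds` and the chosen parameters are illustrative, not from the paper); with edge density below the threshold, the count of rounds is what the abstract bounds by (1/log((k−1)(r−1))) log log n + O(1).

```python
import random
from collections import defaultdict

def peel_rounds(num_vertices, edges, k):
    """Round-based peeling: in each round, delete every vertex of degree < k
    (and all hyperedges containing it). Returns (rounds, remaining edges);
    the remaining edges are exactly those of the k-core."""
    incidence = defaultdict(set)              # vertex -> indices of live edges
    for i, e in enumerate(edges):
        for v in e:
            incidence[v].add(i)
    live_edges = set(range(len(edges)))
    alive = set(range(num_vertices))
    rounds = 0
    while True:
        # All vertices of degree < k are removed simultaneously in one round.
        doomed = {v for v in alive if len(incidence[v]) < k}
        if not doomed:
            break
        rounds += 1
        for v in doomed:
            alive.discard(v)
            for i in list(incidence[v]):
                if i in live_edges:
                    live_edges.discard(i)
                    for u in edges[i]:
                        incidence[u].discard(i)
        # Vertices whose degree only now dropped below k wait for the next round.
    return rounds, [edges[i] for i in live_edges]

# A sparse random 3-uniform hypergraph; with m/n = 0.7 the edge density is
# below the 2-core threshold, so peeling should terminate after few rounds.
random.seed(0)
n, m, r, k = 1000, 700, 3, 2
edges = [tuple(random.sample(range(n), r)) for _ in range(m)]
rounds, core = peel_rounds(n, edges, k)
print(rounds, len(core))
```

By construction, every vertex that still appears in the returned edges has degree at least k, matching the k-core definition used throughout these abstracts.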
This paper presents several parallel algorithms on unweighted graphs for hypercube computers. The al...
We give an algorithm that, with high probability, recovers a planted k-partition in a random graph, ...
We propose a series of randomized greedy construction schemes for the hypergraph partitioning proble...
We describe a new family of k-uniform hypergraphs with independent random edges. The hypergraphs hav...
The computation of a peeling order in a randomly generated hypergraph is the most time-consuming ste...
The computation of a peeling order in a randomly generated hypergraph is the most time-consuming ste...
The analysis of several algorithms and data structures can be reduced to the analysis of...
Maintaining a $k$-core decomposition quickly in a dynamic graph has important applications in networ...
The 2-core of a hypergraph is the maximal collection of hyperedges within which no vertex app...
Published source © SIAM Inc. 2004. We describe a technique for determining the thresholds for the app...
We develop some general techniques for converting randomized parallel algorithms into determ...
This article presents parallel algorithms for component decomposition of graph structures on general...
In this paper, we present parallel multilevel algorithms for the hypergraph partitioning problem. In...
The k-core of a hypergraph is the unique subgraph where all vertices have degree at least k and whic...