A method for compression of large graphs and matrices to a block structure is further developed. Szemerédi's regularity lemma is used as a generic motivation for the significance of stochastic block models. Another ingredient of the method is Rissanen's minimum description length principle (MDL). We continue our previous work on the subject, considering cases of missing data and the scaling of algorithms to extremely large graphs. In this way it becomes possible to recover the large-scale structure of huge graphs of a certain type using only a tiny fraction of the graph information, obtaining a compact representation of such graphs that is useful in computations and visualization.
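
To make the idea concrete, the following is a minimal sketch (in Python, using only NumPy) of the kind of procedure the abstract describes: nodes of a graph are assigned to k blocks so that a two-part description length, the encoding cost of the block assignment plus the entropy cost of the links given the empirical block densities, is greedily minimized. The function names, the simplified cost model, and the local-search strategy are illustrative assumptions, not the authors' exact algorithm.

import numpy as np

def description_length(A, labels, k):
    # Two-part MDL cost (in bits) of a k-block model for a 0/1 adjacency
    # matrix A: cost of encoding the node-to-block assignment plus the
    # entropy cost of the links given the empirical block densities.
    # (Simplified sketch: the cost of the k*k density parameters is omitted.)
    n = A.shape[0]
    cost = n * np.log2(k)
    for a in range(k):
        for b in range(k):
            rows = labels == a
            cols = labels == b
            pairs = rows.sum() * cols.sum()
            if pairs == 0:
                continue
            ones = A[np.ix_(rows, cols)].sum()
            p = ones / pairs
            if 0 < p < 1:
                cost += -pairs * (p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return cost

def regular_decomposition(A, k, n_sweeps=20, seed=0):
    # Greedy local search: repeatedly move single nodes to the block that
    # lowers the total description length, until no move helps.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    labels = rng.integers(0, k, size=n)
    best = description_length(A, labels, k)
    for _ in range(n_sweeps):
        improved = False
        for v in range(n):
            current = labels[v]
            for c in range(k):
                if c == current:
                    continue
                labels[v] = c
                cost = description_length(A, labels, k)
                if cost < best:
                    best, current, improved = cost, c, True
            labels[v] = current
        if not improved:
            break
    return labels, best

# Usage (hypothetical): A is a 0/1 adjacency matrix as a NumPy array.
# labels, bits = regular_decomposition(A, k=2)

In the spirit of the scaling and missing-data extensions mentioned above, the same objective could be evaluated on a sampled subgraph or restricted to observed entries, so that only a small part of the graph needs to be read.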