Regular decomposition of large graphs and other structures: Scalability and robustness towards missing data

Hannu Reittu, Ilkka Norros, Fülöp Bazsó

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

4 Citations (Scopus)

Abstract

A method for compressing large graphs and matrices into a block structure is further developed. Szemerédi's regularity lemma serves as a generic motivation for the significance of stochastic block models. Another ingredient of the method is Rissanen's minimum description length (MDL) principle. We continue our previous work on the subject, considering cases of missing data and the scaling of algorithms to extremely large graphs. In this way it becomes possible to uncover the large-scale structure of certain types of huge graphs using only a tiny fraction of the graph information, and to obtain a compact representation of such graphs that is useful in computations and visualization.
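The approach described in the abstract combines three ideas: fitting a stochastic block model, scoring a candidate partition with an MDL-style two-part code length, and working from only a sampled subset of the adjacency information to handle missing data and scale. The sketch below illustrates this combination on a toy graph; it is a minimal illustration, not the authors' implementation, and all names (`sbm_graph`, `code_length`, `regular_decomposition`) and parameter choices are assumptions made for the example.

```python
import math
import random

random.seed(0)

def sbm_graph(sizes, p_in, p_out):
    """Generate a small stochastic block model adjacency matrix (toy data)."""
    n = sum(sizes)
    block = []
    for b, s in enumerate(sizes):
        block += [b] * s
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            p = p_in if block[i] == block[j] else p_out
            if random.random() < p:
                adj[i][j] = adj[j][i] = 1
    return adj, block

def code_length(adj, labels, k, pairs):
    """MDL-style two-part code length, computed over the observed pairs only."""
    cnt = [[0] * k for _ in range(k)]
    ones = [[0] * k for _ in range(k)]
    for i, j in pairs:
        a, b = labels[i], labels[j]
        cnt[a][b] += 1
        ones[a][b] += adj[i][j]
    bits = 0.0
    for a in range(k):
        for b in range(a, k):
            m = cnt[a][b] + (cnt[b][a] if a != b else 0)
            e = ones[a][b] + (ones[b][a] if a != b else 0)
            if m == 0:
                continue
            p = min(max(e / m, 1e-9), 1 - 1e-9)
            # data cost: coding the observed edges given the block density
            bits -= e * math.log2(p) + (m - e) * math.log2(1 - p)
            # model cost: one density parameter per block pair
            bits += 0.5 * math.log2(m + 1)
    return bits

def regular_decomposition(adj, k, sample_frac=0.3, iters=10):
    """Greedy block assignment minimizing code length on a sampled pair set."""
    n = len(adj)
    all_pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    # sample a fraction of pairs: mimics missing data / sub-linear observation
    pairs = random.sample(all_pairs, int(sample_frac * len(all_pairs)))
    labels = [random.randrange(k) for _ in range(n)]
    for _ in range(iters):
        for v in range(n):
            best, best_bits = labels[v], None
            for c in range(k):
                labels[v] = c
                bits = code_length(adj, labels, k, pairs)
                if best_bits is None or bits < best_bits:
                    best, best_bits = c, bits
            labels[v] = best
    return labels, code_length(adj, labels, k, pairs)

adj, truth = sbm_graph([10, 10], p_in=0.8, p_out=0.1)
labels, bits = regular_decomposition(adj, k=2)
```

The greedy sweep here is a deliberately simple stand-in for the paper's scalable algorithm; the point is that the code length is evaluated only on the sampled pairs, so the partition is fitted from a fraction of the graph information.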

Original language: English
Title of host publication: 2017 IEEE International Conference on Big Data (Big Data)
Editors: Zoran Obradovic, Ricardo Baeza-Yates, Jeremy Kepner, Raghunath Nambiar, Chonggang Wang, Masashi Toyoda, Toyotaro Suzumura, Xiaohua Hu, Alfredo Cuzzocrea, Jian Tang, Hui Zang, Jian-Yun Nie, Rumi Ghosh
Publisher: IEEE Institute of Electrical and Electronics Engineers
Pages: 3352-3357
Number of pages: 6
Volume: 2018-January
ISBN (Electronic): 978-1-5386-2715-0, 978-1-5386-2714-3
ISBN (Print): 978-1-5386-2716-7
DOIs
Publication status: Published - 2017
MoE publication type: A4 Article in a conference publication
Event: IEEE International Conference on Big Data (Big Data) - Boston, United States
Duration: 11 Dec 2017 - 14 Dec 2017

Conference

Conference: IEEE International Conference on Big Data (Big Data)
Country: United States
City: Boston
Period: 11/12/17 - 14/12/17

Keywords

  • regular decomposition
  • big data
  • graph analysis
  • sampling

