Efficient data collection in multimedia vehicular sensing platforms

Raffaele Bruno (Corresponding Author), Maddalena Nurchis

Research output: Contribution to journal › Article › Scientific › peer-review

10 Citations (Scopus)

Abstract

Vehicles provide an ideal platform for urban sensing applications, as they can be equipped with all kinds of sensing devices that continuously monitor the environment around the travelling vehicle. In this work, we are particularly concerned with the use of vehicles as building blocks of a multimedia mobile sensor system able to capture camera snapshots of the streets to support traffic monitoring and urban surveillance tasks. However, cameras are high data-rate sensors, while the wireless infrastructures used for vehicular communications may face performance constraints. Thus, mitigating data redundancy is of paramount importance in such systems. To address this issue, in this paper we exploit submodular optimisation techniques to design efficient and robust data collection schemes for multimedia vehicular sensor networks. We also explore an alternative approach for data collection that operates on longer time scales and relies only on localised decisions rather than centralised computations. We use network simulations with realistic vehicular mobility patterns to verify the performance gains of our proposed schemes compared to a baseline solution that ignores data redundancy. Simulation results show that our data collection techniques ensure more accurate coverage of the road network while significantly reducing the amount of transferred data.
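The abstract does not spell out the optimisation model, but a common way to exploit submodularity in this setting is to treat road-network coverage as a monotone submodular set function and select snapshots greedily under an upload budget. The sketch below is illustrative only and does not reproduce the authors' actual scheme; the snapshot identifiers, the segment-coverage sets, and the cost model are assumptions made for the example.

```python
# Illustrative sketch only: the paper's actual formulation is not given here.
# Assumes each candidate snapshot covers a set of road-segment IDs and that
# uploading a snapshot has a fixed transfer cost; the goal is to pick a subset
# that maximises the number of covered segments under a data-transfer budget.
# Coverage is a monotone submodular function, so the greedy rule below enjoys
# a constant-factor approximation guarantee.

def greedy_snapshot_selection(snapshots, budget):
    """snapshots: dict mapping snapshot id -> (set of covered segments, upload cost)."""
    selected = []
    covered = set()
    spent = 0.0
    remaining = dict(snapshots)
    while remaining:
        # Pick the snapshot with the best marginal coverage per unit of upload cost.
        best_id, best_ratio = None, 0.0
        for sid, (segments, cost) in remaining.items():
            gain = len(segments - covered)
            if spent + cost <= budget and cost > 0 and gain / cost > best_ratio:
                best_id, best_ratio = sid, gain / cost
        if best_id is None:  # no affordable snapshot adds new coverage
            break
        segments, cost = remaining.pop(best_id)
        selected.append(best_id)
        covered |= segments
        spent += cost
    return selected, covered

# Hypothetical example: three camera snapshots, two of which overlap heavily.
snaps = {
    "v1_t0": ({"s1", "s2", "s3"}, 1.0),
    "v2_t0": ({"s2", "s3"}, 1.0),      # largely redundant with v1_t0
    "v3_t0": ({"s4", "s5"}, 1.0),
}
print(greedy_snapshot_selection(snaps, budget=2.0))
```

In this toy run the greedy rule skips the redundant snapshot v2_t0, which is the kind of redundancy mitigation the abstract refers to.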
Original language: English
Pages (from-to): 78-95
Journal: Pervasive and Mobile Computing
Volume: 16
DOIs
Publication status: Published - 2014
MoE publication type: A1 Journal article-refereed

Keywords

  • vehicular sensor networks
  • urban surveillance
  • submodular optimisation
  • performance evaluation
