PicSys: Energy-Efficient Fast Image Search on Distributed Mobile Networks

Noor Felemban, Fidan Mehmeti, Hana Khamfroush, Zongqing Lu, Swati Rallapalli, Kevin S. Chan, Tom La Porta

Research output: Contribution to journal › Article

Abstract

Mobile devices collect a large amount of visual data that are useful for many applications. Searching for an object of interest over a network of mobile devices can aid human analysts in a variety of situations. However, processing this information on the devices themselves is a challenge owing to the high computational complexity of state-of-the-art computer vision algorithms, which primarily rely on Convolutional Neural Networks (CNNs). To this end, this paper presents PicSys, a system that answers visual search queries over a mobile network. The objective of the system is to minimize the maximum completion time over all devices while also taking into account the energy consumption of the mobile devices. First, PicSys carefully divides the computation into multiple filtering stages, such that only a small percentage of images need to run the entire CNN pipeline. Splitting the CNN computation into multiple stages requires understanding the intermediate CNN features and systematically trading off accuracy for computation speed. Second, PicSys determines where to run each stage of the multi-stage pipeline so as to fully utilize the available resources. Finally, through extensive experimentation, system implementation, and simulation, we show that the performance of PicSys is close to optimal and significantly outperforms other standard algorithms.
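The core filtering idea described in the abstract, a cheap early stage that screens out most images so that only a small fraction runs the full CNN, can be illustrated with a minimal sketch. The two-stage model, confidence threshold, and tensor shapes below are illustrative assumptions and not the PicSys implementation.

```python
# Minimal sketch (assumed, not the PicSys code) of a two-stage CNN cascade:
# a lightweight head on intermediate features rejects obvious non-matches,
# and only the remaining images run the heavier backbone.
import torch
import torch.nn as nn


class TwoStageFilter(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Stage 1: cheap convolutional stem feeding a lightweight filter head.
        self.stem = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.filter_head = nn.Linear(16 * 4 * 4, num_classes)
        # Stage 2: heavier backbone, run only on images that survive stage 1.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, num_classes),
        )

    def forward(self, images: torch.Tensor, reject_threshold: float = 0.9):
        # Stage 1: score every image with the cheap head.
        feats = self.stem(images).flatten(1)
        early = self.filter_head(feats).softmax(dim=1)
        # Confidently rejected images skip the expensive stage entirely.
        keep = early[:, 0] <= reject_threshold
        scores = early.clone()
        if keep.any():
            # Stage 2: full pipeline only for the surviving fraction.
            scores[keep] = self.backbone(images[keep]).softmax(dim=1)
        return scores, keep


if __name__ == "__main__":
    model = TwoStageFilter().eval()
    batch = torch.randn(8, 3, 64, 64)  # stand-in for images on a device
    with torch.no_grad():
        scores, ran_full_cnn = model(batch)
    print(f"{int(ran_full_cnn.sum())} of {len(batch)} images ran the full CNN")
```

In the same spirit, the threshold trades accuracy for speed: raising it sends fewer images through the full backbone at the cost of possible misses, which mirrors the accuracy-versus-computation trade-off the abstract describes.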

Original language: English (US)
Journal: IEEE Transactions on Mobile Computing
DOIs
Publication status: Accepted/In press - Jan 1, 2019

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Networks and Communications
  • Electrical and Electronic Engineering