DocumentCode :
1633592
Title :
A sparsity-driven approach to multi-camera tracking in visual sensor networks
Author :
Cosar, Serhan ; Cetin, Mujdat
Author_Institution :
STARS team, INRIA Sophia Antipolis, Sophia Antipolis, France
fYear :
2013
Firstpage :
407
Lastpage :
413
Abstract :
In this paper, a sparsity-driven approach is presented for multi-camera tracking in visual sensor networks (VSNs). VSNs consist of battery-powered image sensors, embedded processors, and wireless transceivers. Since energy and bandwidth resources are limited, setting up a tracking system in VSNs is a challenging problem. Motivated by the goal of tracking in a bandwidth-constrained environment, we present a sparsity-driven method to compress the features extracted by the camera nodes, which are then transmitted across the network for distributed inference. We have designed special overcomplete dictionaries that match the structure of the features, leading to very parsimonious yet accurate representations. We have tested our method in indoor and outdoor people-tracking scenarios. Our experimental results demonstrate how our approach leads to communication savings without significant loss in tracking performance.
Keywords :
embedded systems; image sensors; transceivers; video surveillance; VSN; bandwidth-constrained environment; camera nodes; distributed inference; embedded processors; feature extraction; indoor people tracking scenarios; limited bandwidth resources; limited energy resources; multicamera tracking; outdoor people tracking scenarios; sparsity-driven approach; sparsity-driven method; tracking system; visual sensor networks; wireless transceivers; Bandwidth; Cameras; Dictionaries; Feature extraction; Hidden Markov models; Image coding; Image color analysis
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Advanced Video and Signal Based Surveillance (AVSS), 2013 10th IEEE International Conference on
Conference_Location :
Krakow, Poland
Type :
conf
DOI :
10.1109/AVSS.2013.6636674
Filename :
6636674