People tracking must contend with shape changes, self-occlusions, and occlusions caused by other interfering tracks and by fixed objects that hide parts of a person's shape. These problems are especially critical in indoor surveillance and, in particular, in home automation settings, where the need to merge information obtained from different cameras distributed around the house calls for the integration of reliable data acquired over time. Tracking algorithms should therefore be carefully tuned to cope with occlusions and shape changes, working not only at the pixel level but also at the region level. In this work we present a novel object tracking technique based on probabilistic masks and appearance models. Occlusions due to other tracks, occlusions due to background objects, and false occlusions are discriminated, and the classification of the occluded regions of a track is exploited in a selective model update. The tracking system is general enough to be used with any motion segmentation module; it can track people interacting with each other, and it maintains the pixel-to-track assignment even under large occlusions. At the same time, the model update is very reactive, so as to cope with sudden body motion and changes in silhouette shape. Owing to its robustness, the system has been used in several experiments on monitoring people's behavior in indoor settings.
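The idea of a selective model update driven by occlusion classification can be illustrated with a minimal sketch. The class names, occlusion labels, and update rule below are illustrative assumptions, not the authors' exact formulation: a per-track appearance model keeps a per-pixel estimate and a probabilistic mask, and refreshes only the pixels classified as visible, so that pixels hidden by other tracks or by background objects do not corrupt the model.

```python
import numpy as np

# Illustrative occlusion labels (hypothetical names, not from the paper)
VISIBLE, TRACK_OCCLUDED, OBJECT_OCCLUDED = 0, 1, 2

class AppearanceModel:
    """Sketch of a track appearance model with a probabilistic mask."""

    def __init__(self, shape, alpha=0.3):
        self.color = np.zeros(shape)     # per-pixel appearance estimate
        self.mask = np.full(shape, 0.5)  # probabilistic membership mask
        self.alpha = alpha               # high rate -> reactive update

    def update(self, frame, labels):
        visible = labels == VISIBLE
        # Selective update: only visible pixels refresh the appearance...
        self.color[visible] = ((1 - self.alpha) * self.color[visible]
                               + self.alpha * frame[visible])
        # ...and reinforce the mask probability.
        self.mask[visible] = np.minimum(1.0, self.mask[visible] + self.alpha)
        # Pixels occluded by other tracks or by fixed objects are frozen
        # rather than decayed, so the model survives large occlusions.
        # (A false occlusion would instead decay the mask; omitted here.)
```

A high `alpha` keeps the update reactive to sudden body motion, while the frozen occluded pixels preserve the pixel-to-track assignment until the occlusion ends.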
|Publication date:||2004|
|Title:||Track-based and object-based occlusion for people tracking refinement in indoor surveillance|
|Authors:||R. Cucchiara; C. Grana; G. Tardini|
|Conference date:||Oct 15-16|
|Conference name:||2nd International Workshop on Video Surveillance & Sensor Networks|
|Conference venue:||New York|
|Book title:||Proceedings of the ACM 2nd International Workshop on Video Surveillance & Sensor Networks|
|Record type:||Paper in Conference Proceedings|