In this paper we propose an approach to indoor environment surveillance and, in particular, to monitoring people's behaviour in a home automation context. The reference application is the silent, automatic monitoring of the behaviour of people living alone in the house, specifically conceived for people with limited autonomy (e.g., the elderly or the disabled). The aim is to detect dangerous events (such as a person falling down) and to react to them by establishing a remote connection with low-performance clients, such as PDAs (Personal Digital Assistants). To this end, we propose an integrated server architecture, typically connected over an intranet to network cameras, able to segment and track objects of interest; for objects classified as people, the system must also evaluate their posture and infer possibly dangerous situations. Finally, the system is equipped with a specifically designed transcoding server to adapt the video content to PDA requirements (display area and bandwidth) and to the user's requests. The main contributions of the proposal are a reliable real-time object detection and tracking module, a simple but effective posture classifier improved by a supervised learning phase, and a high-performance transcoder inspired by the MPEG-4 object-level standard and tailored to PDAs. Results on different video sequences and a performance analysis are discussed.
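As a rough illustration of how a silhouette-based posture classifier of the kind mentioned above can work, the sketch below classifies a binary foreground mask as "standing" or "lying" from its projection histograms. This is a minimal, hypothetical example, not the authors' actual classifier; all function names and the two-class decision rule are illustrative assumptions.

```python
# Hedged sketch: coarse posture classification from the projection
# histograms of a binary silhouette mask (illustrative, not the
# paper's actual method).

def projections(mask):
    """Return (row, column) projection histograms of a binary mask."""
    rows = [sum(r) for r in mask]          # foreground pixels per row
    cols = [sum(c) for c in zip(*mask)]    # foreground pixels per column
    return rows, cols

def classify_posture(mask):
    """Very coarse standing/lying decision from silhouette extents."""
    rows, cols = projections(mask)
    height = sum(1 for v in rows if v > 0)   # occupied image rows
    width = sum(1 for v in cols if v > 0)    # occupied image columns
    return "standing" if height >= width else "lying"

# A tall, narrow blob reads as standing; a wide, flat one as lying.
tall = [[1, 1] for _ in range(6)]   # 6 rows x 2 columns
wide = [[1] * 6 for _ in range(2)]  # 2 rows x 6 columns
```

A real system of this kind would compare the full projection histograms against learned per-posture templates (the supervised learning phase the abstract refers to) rather than using only the bounding extents.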
|Publication date:||2003|
|Title:||Computer Vision Techniques for PDA Accessibility of In-House Video Surveillance|
|Authors:||R. Cucchiara; C. Grana; A. Prati; R. Vezzani|
|Conference date:||Nov 2-8|
|Conference name:||First ACM SIGMM international workshop on Video surveillance|
|Conference location:||Berkeley, California|
|Book title:||First ACM SIGMM international workshop on Video surveillance|
|Item type:||Conference paper in proceedings|