In recent years, video has been flooding the Internet: websites, social networks, and business multimedia systems are adopting video as their primary medium for communication and information. Videos are normally accessed as a whole and are not indexed by their visual content. Thus, they are often uploaded as short, manually cut clips, with user-provided annotations, keywords, and tags for retrieval. In this paper, we propose a prototype multimedia system that addresses these two limitations: it removes the need for human intervention in video preparation, thanks to fully deep-learning-based solutions, and it decomposes the storytelling structure of the video into coherent parts. These parts can be shots, key-frames, scenes, and semantically related stories, and they are exploited to provide automatic annotation of the visual content, so that parts of a video can be easily retrieved. This also enables a principled re-use of the video itself: users of the platform can produce new storytelling by means of multi-modal presentations, add text and other media, and propose a different visual organization of the content. We present the overall solution and experiments on the re-use capability of our platform in edutainment, conducted through an extensive user evaluation with primary-school students.
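The decomposition into shots is the first step of such a pipeline. The paper relies on deep-learning-based solutions; purely for illustration, a minimal classical baseline for shot boundary detection compares color histograms of consecutive frames and marks a boundary when their L1 distance exceeds a threshold. The function name, bin count, and threshold below are illustrative assumptions, not the system's actual method.

```python
import numpy as np

def shot_boundaries(frames, threshold=0.5):
    """Illustrative histogram-based shot boundary detector (not the paper's method).

    frames: sequence of 2-D numpy arrays (grayscale frames, values in [0, 256)).
    Returns the indices i at which a new shot is assumed to start.
    """
    boundaries = []
    prev = None
    for i, frame in enumerate(frames):
        # 32-bin intensity histogram, normalized to sum to 1
        hist, _ = np.histogram(frame, bins=32, range=(0, 256))
        hist = hist / hist.sum()
        # L1 distance between consecutive histograms signals an abrupt cut
        if prev is not None and np.abs(hist - prev).sum() > threshold:
            boundaries.append(i)
        prev = hist
    return boundaries

# Example: five dark frames followed by five bright frames -> one cut at index 5
frames = [np.zeros((8, 8)) for _ in range(5)] + [np.full((8, 8), 200.0) for _ in range(5)]
print(shot_boundaries(frames))
```

Deep models replace the hand-crafted histogram distance with learned frame representations, but the sliding comparison over consecutive frames follows the same scheme.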
|Publication date:||2017|
|Title:||NeuralStory: an Interactive Multimedia System for Video Indexing and Re-use|
|Authors:||Baraldi, Lorenzo; Grana, Costantino; Cucchiara, Rita|
|Conference date:||19-21 June 2017|
|Conference name:||15th International Workshop on Content-Based Multimedia Indexing|
|Conference venue:||Florence, Italy|
|Book title:||Proceedings of the 15th International Workshop on Content-Based Multimedia Indexing|
|Publication type:||Conference paper in proceedings|
Documents in Iris Unimore are released under a Creative Commons Attribution - NonCommercial - NoDerivatives 3.0 Italy license, unless otherwise indicated.