
On using on-the-fly students' notes in video lecture indexing / Furini, Marco; Mirri, Silvia. - (2017), pp. 1083-1088. (Paper presented at the 14th IEEE Annual Consumer Communications and Networking Conference, CCNC 2017, held in the USA in 2017) [10.1109/CCNC.2017.7983290].

On using on-the-fly students' notes in video lecture indexing

Furini, Marco; Mirri, Silvia
2017

Abstract

The growing number of video lectures in digital archives makes indexing and retrieval critical. Indeed, most systems base the indexing process on a few generic pieces of textual information (e.g., the course title and the teacher's name), which creates problems for students who are looking for very specific topics and therefore want to browse videos in detail. Moreover, additional metadata could provide useful information to users who access educational materials through assistive technologies. In this paper, we propose an approach that allows students to take on-the-fly notes while watching a video lecture and uses these notes to enrich video lectures with metadata that supports the indexing and retrieval process. In particular, to enable detailed video browsing, our proposed SOcial Learning (SOLE) system defines a set of eight predefined tagnotes and segments the lecture into a sequence of video chapters. Students can use the textual notes to describe and retrieve the video material, providing hints about its content to users with special needs. To evaluate our approach, we developed a prototype version of SOLE and recruited volunteer evaluators. The results showed that users felt comfortable taking notes while watching a video and liked browsing video lectures using notes. According to the evaluation results, both students and video lecture providers might appreciate the proposed approach.
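As a rough illustration of the kind of metadata the abstract describes, the minimal sketch below shows how timestamped notes might be attached to video chapters and used for simple keyword retrieval. The data structures, field names, and matching logic are assumptions made for illustration only and are not taken from the paper or the SOLE prototype.

```python
from dataclasses import dataclass, field

# Hypothetical data model: a chapter groups the on-the-fly notes taken
# while that segment of the lecture was playing. Names and fields are
# illustrative assumptions, not the SOLE implementation.

@dataclass
class Note:
    timestamp: float   # seconds from the start of the lecture
    tag: str           # one of a small set of predefined tag labels
    text: str          # free-text note written by the student

@dataclass
class Chapter:
    start: float
    end: float
    notes: list = field(default_factory=list)

    def add_note(self, note: Note) -> None:
        # Attach the note only if it falls inside this chapter's time span.
        if self.start <= note.timestamp < self.end:
            self.notes.append(note)

def search_chapters(chapters, keyword):
    """Return chapters whose notes mention the keyword (naive substring match)."""
    keyword = keyword.lower()
    return [c for c in chapters
            if any(keyword in n.text.lower() or keyword in n.tag.lower()
                   for n in c.notes)]

# Example usage with made-up data.
chapters = [Chapter(0, 300), Chapter(300, 600)]
chapters[0].add_note(Note(120, "important", "definition of TCP congestion window"))
chapters[1].add_note(Note(420, "question", "unclear step in the proof"))

hits = search_chapters(chapters, "congestion")
print([(c.start, c.end) for c in hits])   # -> [(0, 300)]
```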
Publication year: 2017
Issue date: January 2017
Conference: 14th IEEE Annual Consumer Communications and Networking Conference, CCNC 2017
Location: USA
Conference year: 2017
Pages: 1083-1088
Authors: Furini, Marco; Mirri, Silvia
Files in this item:
There are no files associated with this item.


Use this identifier to cite or link to this item: https://hdl.handle.net/11380/1147554
Citations
  • PMC: not available
  • Scopus: 5
  • ISI: 4