
Guest editorial: focused section on human‑centered robotics / Devasia, Santosh; Cheah, Chien Chern; Pellicciari, Marcello; Peruzzini, Margherita. - In: INTERNATIONAL JOURNAL OF INTELLIGENT ROBOTICS AND APPLICATIONS. - ISSN 2366-598X. - 2:2(2018), pp. 133-135. [10.1007/s41315-018-0058-6]

Guest editorial: focused section on human‑centered robotics

Marcello Pellicciari; Margherita Peruzzini
2018

Abstract

The next generation of the industrial revolution will feature broad applications of intelligent technologies; among the most prominent are intelligent manufacturing and autonomous products such as vehicles and robotic systems. In both cases, autonomous operations take center stage, in which appropriate sensing and perception play critical roles. Indeed, recent advances in sensing and perception technologies have produced exciting new ideas for facilitating autonomous manufacturing and/or robotic vehicular systems. These technologies will potentially evolve with more and more ‘smart functions’ and move manufacturing and robotic systems from single structured operation to sensing/perception-based self-governed yet collaborative multi-system operations. This Focused Section is dedicated to new progress in the modeling, design, control, communication, and implementation of sensing and perception systems for autonomous and/or networked robotics, and intends to provide a state-of-the-art update of research fronts. The Focused Section consists of six research papers covering detection of human motion (Jiang et al.), vision-based pose measurement (Zhang et al.), real-time object detection and tracking (Benabderrahmane), 3-D map reconstruction (Turan et al.; Landsiedel and Wollherr), and a vision-based endoscopic capsule robot (Turan et al.).
2018, Volume 2, Issue 2, pp. 133–135
Files in this product:
10.1007-s41315-018-0058-6.pdf — Publisher's published version, Adobe PDF, 768.26 kB (restricted access)

Creative Commons License
Metadata in IRIS UNIMORE are released under a Creative Commons CC0 1.0 Universal license, while publication files are released under an Attribution 4.0 International (CC BY 4.0) license, unless otherwise indicated.
In case of copyright violation, contact Iris Support

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1161352
Citations
  • Scopus: 1
  • Web of Science: 1