Article history: Received 4 March 2011; Accepted 14 June 2012; Available online 31 July 2012

Keywords: Collaborative design; Virtual environments; Metrics; Design review; Benchmarking method

Abstract

This paper considers applying novel Virtual Environments (VEs) in collaborative product design, focusing on review activities. Companies are usually anchored to commercial ICT tools, which are mature and reliable. However, two main problems emerge: the difficulty of selecting the most suitable tools for specific purposes, and the complexity of evaluating the impact that the technology has on design collaboration. The present work addresses both aspects by proposing a structured benchmarking method based on expert judgements and by defining a set of benchmarking weights based on experimental tests. The method considers both human–human interaction and teamwork-related aspects. A subsequent evaluation protocol, considering both process efficiency and human–human interaction, enables a closed-loop verification process: pilot projects evaluate different technologies, and the benchmarking weights are verified and adjusted for a more reliable system assessment. This paper focuses on synchronous and remote design review activities: three different tools have been compared according to expert judgements, and the two best-performing tools have been implemented as pilot projects within real industrial chains. Design collaboration has been assessed by considering both process performance and human–human interaction quality, and the benchmarking results have been validated by indicating some corrective actions. The final benchmarking weights can thus be adopted for an agile system benchmark in synchronous and remote design. The main findings are an innovative process to verify the reliability of the expert benchmark and a trustworthy benchmarking method to evaluate tools for synchronous and remote design without experimental testing. Furthermore, the proposed method has general validity and can be properly set up for different collaborative dimensions.

1. Introduction

Market globalisation, short delivery times and the rapid evolution of customer requirements strongly influence how the product design process must be performed. It is becoming increasingly important to consider different competencies in the early process phases, which implies organising the cooperative work of a geographically distributed team. Generally, the team configuration changes dynamically according to specific objectives. To manage this flexible cooperation, a new approach called Collaborative Product Design (CPD) has been developed, in which people belong to a virtual design team. However, traditional design tools have not generally been conceived to support collaborative teamwork in a distributed design space. New technologies have recently emerged that allow creating Virtual Design Environments (VDEs) to facilitate CPD through easy interaction and data sharing among all participants. Though several collaborative software applications

⇑ Corresponding author. Tel.: +39 71 2204969/2204790; fax: +39 71 2204801. E-mail address: email@example.com (M. Germani). URL: http://www.univpm.it (M. Germani).

1474-0346/$ - see front matter © 2012 Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.aei.2012.06.003
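As a rough illustration of the expert-based, weighted benchmarking the abstract describes, the sketch below aggregates per-criterion expert judgements into a weighted score per tool and ranks the tools; the two top-ranked tools would then proceed to pilot projects, whose outcomes feed back into adjusted weights (the closed-loop verification). All criterion names, weights, and scores here are hypothetical placeholders, not values taken from the paper.

```python
# Minimal sketch of a weighted expert benchmark (illustrative values only).

# Assumed criteria and weights -- not the paper's actual metric set.
weights = {
    "data_sharing": 0.3,
    "interaction_quality": 0.4,
    "process_efficiency": 0.3,
}

# Hypothetical expert judgements per tool on a 1-5 scale.
scores = {
    "Tool A": {"data_sharing": 4, "interaction_quality": 3, "process_efficiency": 5},
    "Tool B": {"data_sharing": 5, "interaction_quality": 4, "process_efficiency": 3},
    "Tool C": {"data_sharing": 2, "interaction_quality": 5, "process_efficiency": 4},
}

def benchmark(scores, weights):
    """Rank tools by the weighted sum of their criterion scores, highest first."""
    totals = {
        tool: sum(weights[c] * s for c, s in per_criterion.items())
        for tool, per_criterion in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

ranking = benchmark(scores, weights)
# The two best-performing tools go on to pilot projects; pilot results
# would then be used to verify and adjust the weights (closed loop).
top_two = [tool for tool, _ in ranking[:2]]
```

The design choice mirrored here is that expert judgements alone produce the initial ranking, so no experimental testing is needed up front; experiments enter only later, to recalibrate the weights.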
|Publication date:||2012|
|Title:||An approach to assessing virtual environments for synchronous and remote collaborative design|
|Author(s):||Germani, M; Mengoni, M; Peruzzini, Margherita|
|Digital Object Identifier (DOI):||http://dx.doi.org/10.1016/j.aei.2012.06.003|
|ISI identifier:||WOS:000311533200015|
|Scopus identifier:||2-s2.0-84867897591|
|Citation:||An approach to assessing virtual environments for synchronous and remote collaborative design / Germani, M; Mengoni, M; Peruzzini, Margherita. - In: ADVANCED ENGINEERING INFORMATICS. - ISSN 1474-0346. - ELECTRONIC. - 26:4(2012), pp. 793-813.|
|Type:||Journal article|