The use of visual and auditory feedback for assembly task performance in a virtual environment

Zhang, Ying ORCID: https://orcid.org/0000-0002-6669-1671, Fernando, Terrence, Sotudeh, Reza and Xiao, Hannan (2005) The use of visual and auditory feedback for assembly task performance in a virtual environment. In: The 9th International Conference on Information Visualisation (IV'05), 6-8 July 2005, London, UK.

Full text not available from this repository.

Abstract

This paper presents the creation and evaluation of a multi-modal interface for a virtual assembly environment. The work involved implementing an assembly simulation environment with multi-sensory (visual and auditory) feedback and evaluating the effects of multimodal feedback on assembly task performance. This virtual environment experimental platform brought together complex technologies such as constraint-based assembly simulation, optical motion tracking, and real-time 3D sound generation around a virtual reality workbench and a common software platform. A peg-in-a-hole task and a Sener electronic box assembly task were used as task cases for human factors experiments with sixteen subjects. Both objective performance data (task completion time and human performance error rates) and subjective opinions (questionnaires) were gathered from the experiment.

Item Type: Conference or Workshop Item (Paper)
ISSN: 1550-6037
ISBN: 0769523978
Identifier: 10.1109/IV.2005.127
Keywords: Virtual Environment, Assembly Simulation, Multi-sensory Feedback, Usability, Task Performance.
Subjects: Computing > Innovation and user experience > Computing interaction design
Computing > Innovation and user experience > Usability
Computing
Depositing User: Ying Zhang
Date Deposited: 19 Jul 2019 13:18
Last Modified: 28 Aug 2021 07:11
URI: https://repository.uwl.ac.uk/id/eprint/6253
