Paterson, Justin ORCID: https://orcid.org/0000-0001-7822-319X and Visser, Andy (2024) Feel the future – Touching Sound. In: Innovation in Music conference, 14-16 Jun 2024, Oslo, Norway. (Unpublished)
Full text not available from this repository.

Abstract
Combining haptics and audio represents an emergent and flexible bi-modality. Numerous musical applications have been devised around it; although force feedback has been deployed, most remain vibrotactile. Applications range from the assistive to the performative, and various production tools have been explored, but collaborative musical performance supported by force feedback is not yet widely explored. That gap is the focus of this presentation.
The presentation will first offer perspectives on how haptic audio might yet become part of our future lives. It will then present research into the ongoing development of a novel music-performance environment, in which an interactive music track is modelled as a flat, deformable abstraction in extended reality. Physics modelling and neural networks are used to exert control over this abstraction and to provide both visual and haptic-force feedback in response to user interaction. Networked computers hosting a hybrid software configuration allow multiple users to engage with the abstraction remotely, using haptic robotic controllers to deliver a collaborative music performance.
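To illustrate the kind of coupling the abstract describes (this is a minimal sketch, not the authors' implementation), the code below models the deformable surface as a simple mass-spring lattice and returns a penalty reaction force that could be streamed to a haptic controller. All class names and parameters are hypothetical assumptions for the example.

```python
# Minimal sketch: a mass-spring lattice standing in for the flat deformable
# abstraction, with a penalty-contact model that yields a reaction force for
# a haptic probe. Parameter values are illustrative only.
import numpy as np

class DeformablePlane:
    def __init__(self, n=16, spacing=0.05, k=40.0, damping=2.0, dt=1/1000):
        self.k, self.c, self.dt = k, damping, dt
        xs, ys = np.meshgrid(np.arange(n) * spacing, np.arange(n) * spacing)
        self.rest = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)  # rest shape (flat plane)
        self.pos = self.rest.copy()
        self.vel = np.zeros_like(self.pos)

    def step(self, probe_pos, probe_radius=0.03, k_contact=600.0):
        """Advance one physics step; return the force vector to send to the haptic device."""
        # Spring-damper force pulling each node back towards the flat rest shape.
        force = -self.k * (self.pos - self.rest) - self.c * self.vel

        # Penalty contact: nodes inside the probe radius are pushed away from the probe.
        offset = self.pos - probe_pos
        dist = np.linalg.norm(offset, axis=-1, keepdims=True)
        penetration = np.where(dist < probe_radius, probe_radius - dist, 0.0)
        direction = np.divide(offset, dist, out=np.zeros_like(offset), where=dist > 1e-9)
        contact = k_contact * penetration * direction  # force on nodes, away from probe
        force += contact
        haptic_force = -contact.sum(axis=(0, 1))       # equal-and-opposite force on the probe

        # Semi-implicit Euler integration at a haptics-friendly rate (here 1 kHz).
        self.vel += force * self.dt
        self.pos += self.vel * self.dt
        return haptic_force

plane = DeformablePlane()
print(plane.step(probe_pos=np.array([0.4, 0.4, 0.01])))  # 3-D force for the controller
```

In such a scheme the physics loop would typically run at around 1 kHz to keep the force feedback stable, with the visual rendering and any network synchronisation running at lower rates.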
The use of haptics appears to offer an enhanced level of control compared to traditional virtual-reality wands, thus facilitating precise performative gestures that may not otherwise be possible. The presentation will include a breakdown of the system architecture required to deliver this, and demonstrate a performance using it.
| Item Type: | Conference or Workshop Item (Lecture) |
|---|---|
| Subjects: | Music |
| Related URLs: | |
| Depositing User: | Justin Paterson |
| Date Deposited: | 10 Oct 2024 10:06 |
| Last Modified: | 10 Oct 2024 10:06 |
| URI: | https://repository.uwl.ac.uk/id/eprint/12758 |