[1] D. H. Holding and A. W. Macrae, “Guidance, Restriction and Knowledge of Results,” Ergonomics, vol. 7, no. 3, pp. 289–295, 1964, doi: 10.1080/00140136408930748.
[2] A. Çamcı, M. Vilaplana, and L. Wang, “Exploring the Affordances of VR for Musical Interaction Design with VIMEs,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Birmingham, UK, 2020, pp. 121–126, [Online]. Available: https://www.nime.org/archives/.
[3] V. de las Pozas, “Semi-Automated Mappings for Object-Manipulating Gestural Control of Electronic Music,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Birmingham, UK, 2020, pp. 631–634, [Online]. Available: https://www.nime.org/archives/.
[4] S. Agrawal, A. Simon, S. Bech, K. Bærentsen, and S. Forchhammer, “Defining Immersion: Literature Review and Implications for Research on Immersive Audiovisual Experiences,” presented at the Audio Engineering Society Convention 147, Oct. 2019, Accessed: Jan. 05, 2021. [Online]. Available: https://www.aes.org/e-lib/browse.cfm?elib=20648.
[5] G. Grindlay, “Haptic Guidance Benefits Musical Motor Learning,” in 2008 Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2008, pp. 397–404, doi: 10.1109/HAPTICS.2008.4479984.
[6] Y. Zhang, Y. Li, D. Chin, and G. Xia, “Adaptive Multimodal Music Learning via Interactive Haptic Instrument,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Porto Alegre, Brazil, Jun. 2019, pp. 140–145, doi: 10.5281/zenodo.3672900.
[7] G. Young, D. Murphy, and J. Weeter, “A Qualitative Analysis of Haptic Feedback in Music Focused Exercises,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Copenhagen, Denmark, Jun. 2017, pp. 204–209, doi: 10.5281/zenodo.1176222.
[8] D. Chin, Y. Zhang, T. Zhang, J. Zhao, and G. G. Xia, “Interactive Rainbow Score: A Visual-centered Multimodal Flute Tutoring System,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Birmingham, UK, 2020, pp. 208–213, [Online]. Available: https://www.nime.org/archives/.
[9] K. Ashimori and H. Igarashi, “Complemental Learning Assist for Musical Instruments by Haptic Presentation,” in 2018 IEEE 15th International Workshop on Advanced Motion Control (AMC), 2018, pp. 175–180, doi: 10.1109/AMC.2019.8371083.
[10] TESLASUIT, “VR Glove by TESLASUIT,” TESLASUIT, Jan. 24, 2020. https://teslasuit.io/blog/vr-glove-by-teslasuit/ (accessed Jan. 18, 2021).
[11] L. Chu, “Haptic feedback in computer music performance,” in Proceedings of the International Computer Music Conference (ICMC), Hong Kong, Aug. 1996, vol. 96, pp. 57–58, [Online]. Available: https://www.semanticscholar.org/paper/Haptic-Feedback-in-Computer-Music-Performance-Chu/a2b552ddeafdcc26e6333e98ab6574360a8fe46b.
[12] S. Rimell, D. M. Howard, A. M. Tyrrell, R. Kirk, and A. Hunt, “Cymatic. Restoring the Physical Manifestation of Digital Sound Using Haptic Interfaces to Control a New Computer Based Musical Instrument,” presented at the International Computer Music Conference, Gothenburg, Sweden, Sep. 2002, Accessed: Jan. 20, 2021. [Online]. Available: https://dblp.org/db/conf/icmc/icmc2002.html.
[13] I. Hwang, H. Son, and J. R. Kim, “AirPiano: Enhancing music playing experience in virtual reality with mid-air haptic feedback,” in 2017 IEEE World Haptics Conference (WHC), 2017, pp. 213–218.
[14] Ultraleap, “Ultraleap,” 2021. https://www.ultraleap.com/ (accessed Jan. 16, 2021).
[15] E. Frid, J. Moll, R. Bresin, and E.-L. Sallnäs Pysander, “Haptic feedback combined with movement sonification using a friction sound improves task performance in a virtual throwing task,” J. Multimodal User Interfaces, vol. 13, no. 4, pp. 279–290, Dec. 2019, doi: 10.1007/s12193-018-0264-4.
[16] S. Serafin, C. Erkut, J. Kojs, N. C. Nilsson, and R. Nordahl, “Virtual Reality Musical Instruments: State of the Art, Design Principles, and Future Directions,” Comput. Music J., Dec. 2016, doi: 10.1162/COMJ_a_00372.
[17] E. Gunther and S. O’Modhrain, “Cutaneous Grooves: Composing for the Sense of Touch,” J. New Music Res., vol. 32, no. 4, pp. 369–381, Dec. 2003, doi: 10.1076/jnmr.32.4.369.18856.
[18] L. Boer, B. Cahill, and A. Vallgårda, “The Hedonic Haptics Player: A Wearable Device to Experience Vibrotactile Compositions,” in Proceedings of the 2017 ACM Conference Companion Publication on Designing Interactive Systems, New York, NY, USA, 2017, pp. 297–300, doi: 10.1145/3064857.3079178.
[19] B. D. Adelstein, D. R. Begault, M. R. Anderson, and E. M. Wenzel, “Sensitivity to Haptic-Audio Asynchrony,” in Proceedings of the 5th International Conference on Multimodal Interfaces, New York, NY, USA, 2003, pp. 73–76, doi: 10.1145/958432.958448.
[20] T. Mudd, “Feeling for Sound: Mapping Sonic Data to Haptic Perceptions,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Jun. 2013, pp. 369–372, doi: 10.5281/zenodo.1293003.
[21] G. Ren, S. Wei, E. O’Neill, and F. Chen, “Towards the Design of Effective Haptic and Audio Displays for Augmented Reality and Mixed Reality Applications,” Adv. Multimed., vol. 2018, pp. 1–11, Jul. 2018, doi: 10.1155/2018/4517150.
[22] M. D. Fletcher and J. Zgheib, “Haptic sound-localisation for use in cochlear implant and hearing-aid users,” Sci. Rep., vol. 10, no. 1, Art. no. 1, Aug. 2020, doi: 10.1038/s41598-020-70379-2.
[23] E. Berdahl and A. Kontogeorgakopoulos, “The FireFader: Simple, Open-Source, and Reconfigurable Haptic Force Feedback for Musicians,” Comput. Music J., vol. 37, no. 1, pp. 23–34, 2013, doi: 10.1162/COMJ_a_00166.
[24] D. Quiroz and D. Martin, “Exploratory research into the suitability of various 3D input devices for an immersive mixing task. Part II.,” presented at the Audio Engineering Society Convention 149, Oct. 2020, [Online]. Available: http://www.aes.org/e-lib/browse.cfm?elib=20979.
[25] C. Dewey and J. Wakefield, “Exploring the Container Metaphor for Equalisation Manipulation,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Porto Alegre, Brazil, Jun. 2019, pp. 130–133, doi: 10.5281/zenodo.3672892.
[26] B. Di Donato, C. Dewey, and T. Michailidis, “Human-Sound Interaction: Towards a Human-Centred Sonic Interaction Design Approach,” New York, NY, USA, 2020, doi: 10.1145/3401956.3404233.
[27] J. L. Paterson, “The preset is dead; long live the preset,” presented at the Audio Engineering Society Convention 130, London, UK, May 2011, Accessed: Mar. 19, 2013. [Online]. Available: http://www.aes.org/e-lib/browse.cfm?elib=16569.
[28] M. Walther-Hansen, “New and Old User Interface Metaphors in Music Production,” Journal on the Art of Record Production, 2017. https://www.arpjournal.com/asarpwp/new-and-old-user-interface-metaphors-in-music-production/ (accessed Jan. 06, 2021).
[29] D. Rocchesso et al., “Sonic Interaction Design: Sound, Information and Experience,” in CHI ’08 Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2008, pp. 3969–3972, doi: 10.1145/1358628.1358969.
[30] A. Freed and A. Schmeder, “Features and Future of Open Sound Control version 1.1 for NIME,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Jun. 2009, pp. 116–120, doi: 10.5281/zenodo.1177517.
[31] 3D Systems, Inc., “Phantom Premium,” 3D Systems, 2021. https://uk.3dsystems.com/haptics-devices/3d-systems-phantom-premium/specifications (accessed Jan. 19, 2021).
[32] A. Bell, E. Hein, and J. Ratcliffe, “Beyond Skeuomorphism: The Evolution of Music Production Software User Interface Metaphors,” J. Art Rec. Prod., no. 9, Apr. 2015, Accessed: Jan. 13, 2021. [Online]. Available: https://www.arpjournal.com/asarpwp/beyond-skeuomorphism-the-evolution-of-music-production-software-user-interface-metaphors-2/.
[33] O. Tokatli et al., “A Classroom Deployment of a Haptic System for Learning Cell Biology,” in International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, 2018, pp. 379–389.
[34] J. R. Blum et al., “Getting Your Hands Dirty Outside the Lab: A Practical Primer for Conducting Wearable Vibrotactile Haptics Research,” IEEE Trans. Haptics, vol. 12, no. 3, pp. 232–246, Jul. 2019, doi: 10.1109/TOH.2019.2930608.
[35] G. Le Vaillant, T. Dutoit, and R. Giot, “Analytic vs. holistic approaches for the live search of sound presets using graphical interpolation,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Birmingham, UK, Jul. 2020, pp. 227–232, [Online]. Available: https://www.nime.org/archives/.
[36] S. Wall and W. Harwin, “Quantification of the effects of haptic feedback during a motor skills task in a simulated environment,” in Proceedings of the Second PHANToM Users Research Symposium, Zurich, Switzerland, 2000, pp. 61–69.
[37] K. Chun, B. Verplank, F. Barbagli, and K. Salisbury, “Evaluating haptics and 3D stereo displays using Fitts’ law,” in The 3rd IEEE International Workshop on Haptic, Audio and Visual Environments and Their Applications, 2004, pp. 53–58, doi: 10.1109/HAVE.2004.1391881.
[38] J. Mycroft and J. L. Paterson, “Activity flow in music equalization: The cognitive and creative implications of interface design,” presented at the Audio Engineering Society Convention 130, May 2011, Accessed: Mar. 19, 2013. [Online]. Available: http://www.aes.org/e-lib/browse.cfm?elib=16568.
[39] J. Francombe, R. Mason, M. Dewhirst, and S. Bech, “Elicitation of attributes for the evaluation of audio-on-audio interference,” J. Acoust. Soc. Am., vol. 136, no. 5, pp. 2630–2641, 2014.
[40] B. De Man and J. D. Reiss, “APE: Audio Perceptual Evaluation Toolbox for MATLAB,” Apr. 2014, [Online]. Available: http://www.aes.org/e-lib/browse.cfm?elib=17160.
[41] British Psychological Society, “Morrisby Profile,” Psychological Testing Centre, 2021. https://ptc.bps.org.uk/test-review/morrisby-profile (accessed Jan. 21, 2021).
[42] igroup, “igroup Presence Questionnaire (IPQ),” igroup.org – project consortium, 2016. http://www.igroup.org/pq/ipq/index.php (accessed Jan. 14, 2021).
[43] J. A. Horne and O. Ostberg, “A self-assessment questionnaire to determine morningness-eveningness in human circadian rhythms,” Int. J. Chronobiol., vol. 4, no. 2, pp. 97–110, 1976.
[44] M. J. Sullivan, H. Adams, S. Horan, D. Maher, D. Boland, and R. Gross, “Injustice Experience Questionnaire (IEQ),” eProvide, 2016. https://eprovide.mapi-trust.org/instruments/injustice-experience-questionnaire (accessed Jan. 14, 2021).
[45] S. Rigby and R. Ryan, “The Player Experience of Need Satisfaction (PENS),” 2007. https://immersyve.com/white-paper-the-player-experience-of-need-satisfaction-pens-2007/ (accessed Jan. 14, 2021).
[46] N. Gillian, “Gesture Recognition Toolkit (GRT),” GitHub, 2019. https://github.com/nickgillian/grt (accessed Jan. 14, 2021).
[47] N. Gillian, “ml.lib,” GitHub. http://irllabs.github.io/ml-lib/#class-reference-mlmlp (accessed Jan. 14, 2021).
[48] K. Hagelsteen, R. Johansson, M. Ekelund, A. Bergenfelz, and M. Anderberg, “Performance and perception of haptic feedback in a laparoscopic 3D virtual reality simulator,” Minim. Invasive Ther. Allied Technol., vol. 28, no. 5, pp. 309–316, Sep. 2019, doi: 10.1080/13645706.2018.1539012.
[49] A. Torabi, M. Khadem, K. Zareinia, G. R. Sutherland, and M. Tavakoli, “Application of a Redundant Haptic Interface in Enhancing Soft-Tissue Stiffness Discrimination,” IEEE Robot. Autom. Lett., vol. 4, no. 2, pp. 1037–1044, Apr. 2019, doi: 10.1109/LRA.2019.2893606.
[50] B. De Man, R. Stables, and J. D. Reiss, Intelligent Music Production, 1st ed. New York: Routledge, 2019.
[51] G. Fazekas and M. Sandler, “The Studio Ontology Framework,” Jan. 2011, pp. 471–476.
[52] B. Albert et al., “A Smart System for Haptic Quality Control - Introducing an Ontological Representation of Sensory Perception Knowledge,” Nov. 2016, pp. 21–30, doi: 10.5220/0006048300210030.
[53] M. Cartwright, B. Pardo, and J. Reiss, “MIXPLORATION: Rethinking the Audio Mixer Interface,” in Proceedings of the 19th International Conference on Intelligent User Interfaces, New York, NY, USA, 2014, pp. 365–370, doi: 10.1145/2557500.2557530.