About the Sonic Interaction Design Lab
The Sonic Interaction Design Lab at the Centre for Digital Music explores new ways
of encountering sound, from interactive art to real-time data sonification.
Sonic Interaction Design is about designing and evaluating interactive systems that prioritise sound over vision. This is especially important in the post-screen world we live in (Bryan-Kinns, 2017) where the pervasive, affective, and co-temporal features of sound contribute to the design of:
- Mobile and wearable interaction
- Systems for creative, expressive, artistic interaction
- Tangible and physical interaction, i.e. objects which do not have screens
- Internet of Things, e.g. voice-controlled smart devices such as Amazon Alexa
- Eyes-free interaction, e.g. whilst attending to another task such as driving
- Accessible interaction
The SID Lab focusses on researching the use of sound, specifically non-verbal sound, as the primary display medium.
As part of the Centre for Digital Music Platform grant we have funded commissions on
interactive sound, for example exploring the interplay of pianos, algorithms, physical
intervention, and performance.
Current Sonic Interaction Design research at C4DM can be broadly categorised into research into i) design methodologies and ii) evaluation methodologies, as illustrated by the international journal papers below.
We use sound as a design focus, provocation, and facilitator in the Interaction Design process to drive creative, affective, and accessible Interaction Design ideation. Examples of using sound in design at QMUL include:
- Sonic Interaction Design in a Co-Creation Design Methodology. In this case developing methodologies to use sound in design practice as an approach to engaging people together in design across cultures (Wang et al., 2016).
- Sonic Interaction Design in Affective and Artistic Design Methodologies. For example, studying the role of sound art in practices of inter-disciplinary design (Murray-Browne et al., 2013), and the affective use of sound in design methodologies for performative interaction (Sheridan & Bryan-Kinns, 2008).
- Sonic Interaction Design for Accessible Design Methodologies. For example, exploring methodologies which use sound in participatory design to include visually impaired populations (Metatla et al., 2016; Metatla et al., 2014).
Rigorous evaluation methodologies for understanding our responses to interactive sound systems are built on techniques from Human Computer Interaction. Research in this area explores how existing HCI techniques can be applied to systems which predominantly feature interactive sound, and what new techniques need to be developed to evaluate such systems. Examples include:
- Evaluation of engagement with Creative Activities with Sonic Interaction Design systems. For example, evaluating audience and performer creativity in interactive music performances (Wu et al., 2017), evaluation of collaborative creativity with mobile music apps (Bryan-Kinns, 2012a, 2012b, 2011; Bryan-Kinns & Hamilton, 2009), and evaluation of expert musician perceptions of creativity with sonic interaction (Stowell et al., 2009).
- Evaluation of responses to Affective and Artistic use of Sonic Interaction Design. For example, evaluating how wearables could be used to enhance mood music in films (Mazzoni and Bryan-Kinns, 2016), evaluation of affect in collaborative interactive music making (Morgan et al., 2015), and evaluating affective response to public interactive art (Marshall et al., 2010; Sheridan et al., 2007).
- Evaluation of the Accessible use of Sonic Interaction Design. For example, examining the effectiveness of using interactive sound to access and manipulate diagrams (Metatla et al., 2016; Metatla et al., 2012a; Metatla et al., 2012b).
The lab is led by Dr. Nick Bryan-Kinns with Dr. Tony Stockman. Projects in the area include:
- Interactive real-time musical systems
- Interactive soundscapes
- Understanding collaborative music making
- Interactive data sonification
- Audio games
- Methods for designing and evaluating auditory displays
- Distributed music making systems
- Musical audio analysis for real-time interaction
- Interaction design for musical composition
- Visualising structured data about music
- Auditory graphs
- Cross-modal interaction
- Auditory overviews
- Spatialised sound composition
- Collaborative auditory displays and sonification
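To give a flavour of what interactive data sonification and auditory graphs involve, the sketch below shows one common approach: parameter-mapping sonification, where each data value is mapped to the pitch of a short tone so that a rising data series is heard as a rising melodic contour. This is an illustrative example only, not code from any of the projects above; the function names and frequency range are our own assumptions.

```python
import math
import struct
import wave

SAMPLE_RATE = 22050  # samples per second for the rendered audio

def sonify(values, min_freq=220.0, max_freq=880.0, note_dur=0.25):
    """Parameter-mapping sonification (illustrative sketch):
    map each data value linearly onto a pitch between min_freq and
    max_freq, rendering one short sine tone per data point."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    samples = []
    n = int(SAMPLE_RATE * note_dur)
    for v in values:
        freq = min_freq + (v - lo) / span * (max_freq - min_freq)
        for i in range(n):
            env = 1.0 - i / n  # linear fade-out to avoid clicks between notes
            samples.append(env * math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
    return samples

def write_wav(path, samples):
    """Write mono 16-bit PCM audio using only the standard library."""
    with wave.open(path, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(b"".join(
            struct.pack("<h", int(s * 32767)) for s in samples))

# A rising data series is heard as a rising pitch contour
write_wav("sonification.wav", sonify([1, 3, 2, 5, 8]))
```

Real auditory displays go well beyond this sketch, mapping data onto timbre, rhythm, spatial position, and other dimensions, but the core idea of a systematic data-to-sound mapping is the same.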
Key conferences, networks, and journals in the area include: