Select Publications

R. Graham. “To Notice and Remember: a collaborative work exploring the application of historical data in virtual reality.” Artist Statement. In Leonardo Music Journal (Issue 27), MIT Press. Special Issue on History and Memory. Late 2017.

R. Graham, W. Brent, B. Bridges, C. Manzione. “Exploring Pitch and Timbre Spaces in VR: Virtual Reality as an Incubator for Performance Systems Design” in Proceedings of the International Conference on New Interfaces for Musical Expression. Aalborg University. Copenhagen, Denmark. May 2017.

R. Graham and B. Bridges. “Competing Attractions, Orbital Decay and the Music of the Spheres: Force–based relational dynamics for organizing space and timbre in performance using physical modeling.” In Emille: Journal of the Korea Electro-Acoustic Music Society. October 2016.

R. Graham and S. Cluett. “The Soundfield as Sound Object: Virtual Reality Environments as a Three-Dimensional Canvas for Music Composition” in the Proceedings of the Audio Engineering Society International Conference on Audio for Virtual and Augmented Reality. Paper Number: 7-3. AES E-Lib: 18510. Los Angeles, USA. October 2016.

R. Graham. “High-Density Loudspeaker Arrays as a Performance Environment.” Artist Statement / Letter. In Computer Music Journal. Special Issue on Computer Music for High-Density Loudspeaker Arrays. MIT Press. 40:4. Winter 2016.

Strategies for Spatial Music Performance: The Practicalities and Aesthetics of Responsive Systems Design
Divergence Press / Centre for Research in New Music at the University of Huddersfield (ISSN 2052-3467) · Jan 27, 2015

This article will explore practical and aesthetic questions concerning spatial music performance by interrogating new developments within an emerging hyperinstrumental practice. The performance system is based on an electric guitar with individuated audio outputs per string and a multichannel loudspeaker array. A series of spatial music mapping strategies will explore in-kind relationships between a formal melodic syntax model and an ecological flocking simulator, exploiting broader notions of embodiment underpinning the metaphorical basis for the experience and understanding of musical structure. The extension and refinement of this system have been based on a combination of practice-led and theoretical developments. The resulting mapping strategies will forge new gestural narratives between physical and figurative gestural planes, culminating in a responsive, bodily based, and immersive spatial music performance practice. The operation of the performance system is discussed in relation to supporting audiovisual materials.
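
As a rough illustration (not the published system) of the kind of per-string spatialization described above, the sketch below pans each string's signal over a loudspeaker ring using distance-based amplitude weighting, with source positions assumed to come from a flocking simulator; `speaker_ring`, `panning_gains`, and all values are illustrative assumptions.

```python
# Minimal sketch (not the published system): distance-based amplitude panning
# of per-string guitar signals over a ring of loudspeakers. Source positions
# would be supplied by the flocking simulator in a system like the one described.
import math

def speaker_ring(n, radius=2.0):
    """Return n loudspeaker positions evenly spaced on a circle."""
    return [(radius * math.cos(2 * math.pi * i / n),
             radius * math.sin(2 * math.pi * i / n)) for i in range(n)]

def panning_gains(src, speakers, rolloff=1.0):
    """Per-speaker gains from inverse-distance weighting, normalized to unit sum."""
    weights = [1.0 / (math.dist(src, spk) ** rolloff + 1e-6) for spk in speakers]
    total = sum(weights)
    return [w / total for w in weights]

# Example: one string's flock agent sits front-left of an 8-channel ring.
speakers = speaker_ring(8)
print([round(g, 3) for g in panning_gains((-0.5, 1.0), speakers)])
```
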
Gesture and Embodied Metaphor in Spatial Music Performance Systems Design


This paper describes the design, theoretical underpinnings and development of a hyper-instrumental performance system driven by gestural data obtained from an electric guitar. The system combines a multichannel audio feed (parsed for its pitch contour, spectral content and note inter–onset time data) with motion tracking of the performer’s larger–scale bodily movements using a Microsoft Kinect sensor. These gestural materials provide the basis for the system’s musical mapping strategies, informed by an integration of embodied cognitive models with electroacoustic/electronic music theory (specifically, Smalley’s spectromorphology). The performance system’s sound processing is further animated using the boids flocking algorithm by Reynolds. This provides an embodied/ecological base for connecting Lerdahl’s spatial and syntactical models of tonal harmony with sound spatialization and textural processing. Through this work, we aim to advance broadly applicable performance gesture ecologies, providing typologies that facilitate creative (but still coherent) mappings from physical and figurative performance gestures to spatial and textural structures.
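
As a hypothetical illustration of how extracted audio features might animate a Reynolds-style flock (not the authors' implementation), the sketch below modulates cohesion and separation weights from two assumed, normalized per-string features: spectral centroid and note inter-onset time. The `Boid` class, `step`, and `feature_driven_step` are illustrative assumptions.

```python
# Minimal sketch, not the published system: a Reynolds-style boids update whose
# cohesion/separation weights are driven by hypothetical per-string audio features.
import random

class Boid:
    def __init__(self):
        self.pos = [random.uniform(-1, 1), random.uniform(-1, 1)]
        self.vel = [random.uniform(-0.1, 0.1), random.uniform(-0.1, 0.1)]

def step(boids, cohesion=0.01, separation=0.05, max_speed=0.2):
    n = len(boids)
    cx = sum(b.pos[0] for b in boids) / n
    cy = sum(b.pos[1] for b in boids) / n
    for b in boids:
        # Cohesion: steer toward the flock centroid.
        b.vel[0] += cohesion * (cx - b.pos[0])
        b.vel[1] += cohesion * (cy - b.pos[1])
        # Separation: push away from very close neighbours.
        for other in boids:
            if other is not b:
                dx, dy = b.pos[0] - other.pos[0], b.pos[1] - other.pos[1]
                if dx * dx + dy * dy < 0.05:
                    b.vel[0] += separation * dx
                    b.vel[1] += separation * dy
        # Clamp speed, then integrate position.
        speed = (b.vel[0] ** 2 + b.vel[1] ** 2) ** 0.5
        if speed > max_speed:
            b.vel = [v * max_speed / speed for v in b.vel]
        b.pos[0] += b.vel[0]
        b.pos[1] += b.vel[1]

# Hypothetical feature mapping: brighter playing (higher normalized spectral
# centroid) tightens the flock; faster onsets (smaller inter-onset times) push it apart.
def feature_driven_step(boids, centroid_norm, ioi_norm):
    step(boids,
         cohesion=0.005 + 0.02 * centroid_norm,
         separation=0.02 + 0.08 * (1.0 - ioi_norm))

flock = [Boid() for _ in range(6)]
feature_driven_step(flock, centroid_norm=0.7, ioi_norm=0.3)
print([tuple(round(x, 2) for x in b.pos) for b in flock])
```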


Managing Musical Complexity with Embodied Metaphors
Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Louisiana State University, Baton Rouge, USA (ISSN 2220-4806) · 2015

This paper presents the ideas and mapping strategies behind a performance system that uses a combination of motion tracking and feature extraction tools to manage complex multichannel audio materials for real-time music composition. The use of embodied metaphors within these mappings is seen as a means of managing the complexity of a musical performance across multiple modalities. In particular, we will investigate how these mapping strategies may facilitate the creation of performance systems whose accessibility and richness are enhanced by common integrating bases. A key focus for this work is the investigation of the embodied image schema theories of Lakoff and Johnson alongside similarly embodied metaphorical models within Smalley’s influential theory of electroacoustic music (spectromorphology). These metaphors will be investigated for their use as grounding structural components and dynamics for creative practices and musical interaction design. We argue that pairing metaphorical models of forces with environmental forms may have particular significance for the design of complex mappings for digital music performance.
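
To make the "metaphorical forces paired with environmental forms" idea concrete, here is a minimal, hypothetical sketch (not the published mappings): an attraction force acting on a sound's virtual position, bounded by a circular CONTAINER-style region. The gesture features and constants are assumptions for illustration.

```python
# Minimal sketch of a force metaphor paired with an environmental form:
# a damped attraction (ATTRACTION) confined to a bounded region (CONTAINER).
def update_position(pos, vel, attractor, strength, radius, dt=0.02):
    """One Euler step of a damped attraction, confined to a circular container."""
    ax = strength * (attractor[0] - pos[0]) - 0.5 * vel[0]
    ay = strength * (attractor[1] - pos[1]) - 0.5 * vel[1]
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    # CONTAINER schema: pull the position back to the boundary if it escapes.
    dist = (pos[0] ** 2 + pos[1] ** 2) ** 0.5
    if dist > radius:
        pos = (pos[0] * radius / dist, pos[1] * radius / dist)
    return pos, vel

# Hypothetical mapping: a larger-scale bodily gesture (e.g. a tracked lean)
# scales the attraction strength; a quieter dynamic would shrink the container.
pos, vel = (0.8, 0.2), (0.0, 0.0)
for _ in range(100):
    pos, vel = update_position(pos, vel, attractor=(0.0, 0.0), strength=2.0, radius=1.0)
print(pos)
```
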
Mapping and Meaning: Embodied Metaphors and Non–Localised Structures in Performance System Design



This paper explores the application of gestural structures and embodied image schemas in the design of control mappings and interpretative layers for a spatial music performance system. In doing so, it advances a design that maps structural features derived from musical performance gestures to various aspects of spatial and timbral sound processing. The paper analyses a system by one of the co-authors and discusses the wider implications of these theoretical perspectives for performance system design. The importance of image schemas in such a model lies in their potential to support richer, yet coherent, mappings that go beyond simpler localized transduction–based models to encompass a variety of less localized output modalities.
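
A minimal sketch of the kind of one-to-many interpretative layer argued for above (not the system analysed in the paper): a single structural feature of a performance gesture fans out to several spatial and timbral parameters rather than driving one localized control. Parameter names and ranges are illustrative assumptions.

```python
# Minimal sketch of a one-to-many interpretative layer: one gesture feature
# drives several output modalities. Names and ranges are illustrative only.
def interpret_gesture(energy):
    """Map a normalized gesture-energy value (0..1) to several processing parameters."""
    energy = max(0.0, min(1.0, energy))
    return {
        "spatial_spread": 0.2 + 0.8 * energy,          # wider spatial image as the gesture grows
        "reverb_mix": 0.1 + 0.4 * energy,              # more diffuse environment
        "filter_cutoff_hz": 400.0 + 7600.0 * energy,   # brighter timbre
        "grain_density": 5.0 + 45.0 * energy,          # denser textural processing
    }

print(interpret_gesture(0.6))
```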


SEPTAR: Audio Breakout Circuit for Multichannel Guitar


Multichannel (or divided) audio pickups are becoming increasingly common in the electric guitar and computer music communities. These systems allow performers to access the signal from each string of their instrument independently and concurrently in real-time creative practice. This paper presents an open-source audio breakout circuit that provides independent audio outputs per string of any chordophone (stringed instrument) fitted with a multichannel audio pickup system. The following sections include a brief historical contextualization and discussion of the significance of multichannel audio technology in instrumental guitar music, an overview of our proposed impedance matching circuit for piezoelectric-based audio pickups, and a presentation of a new open-source PCB design (SEPTAR V2) that adds a mountable 13-pin DIN connector to improve compatibility with commercial multichannel pickup systems. The paper closes with a short summary of the potential creative applications and perceptual implications of this multichannel technology when used in creative practice.
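
As a back-of-the-envelope illustration of why piezoelectric pickups call for impedance matching: a piezo element behaves roughly as a capacitive source, so the load resistance it sees forms a first-order high-pass filter that thins out the low end when the load is small. The sketch below computes that corner frequency for a few loads; the capacitance and load values are assumptions, not SEPTAR's measured specifications.

```python
# Rough illustration (assumed values, not SEPTAR's specs): corner frequency of
# the high-pass filter formed by a capacitive piezo source and its load resistance.
import math

def highpass_corner_hz(load_ohms, piezo_capacitance_farads):
    """First-order RC high-pass corner frequency: f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * load_ohms * piezo_capacitance_farads)

piezo_c = 10e-9  # assume ~10 nF for a per-string piezo element
for load in (10e3, 1e6, 10e6):  # typical line input vs. buffered high-impedance inputs
    print(f"{load:>10.0f} ohm load -> corner at {highpass_corner_hz(load, piezo_c):8.1f} Hz")
```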