Duncan Werner, Bruce Wiggins and Emma Fitzmaurice

Academic paper: Innovation in Music: University of West London: December 2019

Development of an Ambisonic Guitar System

GASP: Guitars with Ambisonic Spatial Performance

1. Introduction

The term sound spatialisation denotes a group of techniques for organising and manipulating the spatial projection and movement of sound in a physical or virtual listening environment (Valiquet, 2011). Alongside pitch, timbre, intensity and duration, space is a significant musical parameter and, as such, spatial sound controllers should be regarded as instruments of musical expression (Pysiewicz and Weinzierl, 2016); spatialiser design should therefore include immediacy, liveness, and learnability. Perez-Lopez (2015) identifies several design considerations for spatialisation systems, including: Is the performer exclusively controlling spatial parameters (in contrast to one who controls both spatialisation and sound generation)? What is the required level of expertise for expressivity and virtuosity? Is the spatial control made by an individual, or shared by a group of performers? Does the system provide a graphical user interface for real-time visual feedback? Many of these observations and questions have informed our approach to the development of the GASP project.

2. GASP System Overview

An Ambisonic guitar system has been devised and realised, with the project name GASP: ‘Guitars with Ambisonic Spatial Performance’. GASP is an ongoing University of Derby research project in which our interest in Ambisonic algorithmic research and guitar sound production is combined with off-the-shelf hardware and bespoke software to create an Ambisonic-based immersive guitar sound system. It is an innovative audio project, fusing the musical with the technical and combining individual string timbralisation with Ambisonic immersive sound. GASP also extends into an artistic musical project, where ‘space’ becomes a performance parameter, providing new experimental immersive sound production techniques for the guitarist and music producer. The project initially focused on post-production; however, the system can also be used in the live performance domain on large-format theatre/concert systems. Additionally, the audio may be downmixed to work with traditional stereo sound installations or personal playback over headphones, i.e. binaural or Google 360° virtual reality (VR) sound via web streaming.

The GASP project is a guitar-based 2D Ambisonic instrument and performance system. At the University of Derby, work began using hexaphonic pickups in 2010 (for example, see Randell, 2011). Our current configuration, as shown in Figure 1, consists of two interconnected computers (Mac Pro and iMac), three visual monitors, eight loudspeakers, audio interfacing, and three guitars (two Strat-type and an electro-acoustic) with various hex/multichannel pickups fitted.

Figure 1 – GASP hardware photograph (excluding Ambisonic speaker system) Copyright R. Johnson 2019

3. GASP System Detail

The key features of the GASP system, shown in Figure 2 and Figure 3, comprise:

  • Multichannel guitar pickups to facilitate the processing of individual strings independently for timbre, spatial location and other production processes.
  • An eight-speaker circular array which supports a 2D, 360° spatial representation, with a choice of first, second or third order Ambisonic encoding/decoding.
  • Timbral effects, which are achieved with commercial sound processing software, configured for individual string timbralisation.
  • Spatial positioning and dynamic movement, achieved using bespoke WigWare plug-ins and auditioned over a circular loudspeaker array or via a binaural decoder output on headphones.
  • Arpeggiation effects, achieved with audio gate switching on individual strings using the ‘GuitArpeggiator’; this creates arpeggiation-like effects, such that simple or complex polyrhythms can be realised for recording or live performance.
  • A range of GASP Auditory Scenes, which combine spatial, timbral, and arpeggiation presets.

Figure 2 – GASP system Signal Flow Overview


Figure 3 – GASP lab interconnections overview. Courtesy of H. Dale 2019

3.1. Guitars

The GASP guitars have multichannel pickups (also known as hex, hexaphonic or divided pickups) installed. Two of the guitars, a black Yamaha APX500 electro-acoustic and a maroon Mexican Fender Stratocaster, are retrofitted with Ubertar passive hex pickups (Rubenstein, 2019). A third guitar, a sage green Mexican Fender Stratocaster with pau ferro neck, is retrofitted with Cycfi Nu-Series Modular Active Pickups (Cycfi Research, 2019). The Fender Stratocasters were chosen because they enable straightforward installation of the multichannel pickup, which simply replaces the bridge pickup. This position has the advantage of generating minimal cross-over into the pickup area of the neighbouring string when string bending occurs, although it also produces a slightly thinner timbre than other pickup locations. The exception is the Yamaha electro-acoustic, which has an Ubertar hex pickup mounted across the sound hole. Due to their modular design, the Cycfi Nu-Series pickups are available for guitars with 6, 7, 8 or more strings. Comparing the Ubertar and Cycfi pickups, some variation in output consistency is noted: the active Cycfi pickups provide a more consistent output level across all strings, with high output and transparent timbral clarity, whereas the individual outputs of the passive Ubertar pickups require some level balancing before further processing. Our current system uses a Focusrite Liquid Saffire 56 to route the individual signal from each string to the computer for processing.

3.2. Spatialisation

Spatially, the system is based around Ambisonics, pioneered by Michael Gerzon in the 1970s (Gerzon, 1974a) as an improvement to the then state-of-the-art system, Quadraphonics (Gerzon, 1974b). It was important for the system to be both spatially accurate and flexible, in that the presentation format, in terms of the number and arrangement of loudspeakers, is not fixed, and that it allows for playback over headphones. Ambisonics is a system based around the spherical harmonic decomposition of a sound field in two or three dimensions and the reconstruction of this sound field using loudspeakers or headphones. Headphone playback makes use of head related transfer functions, with or without head-tracking. The flexibility of Ambisonics made it ideal for this project. More recently, Ambisonics has been adopted as one of the standard playback formats for virtual reality and 360° videos, further expanding the possibilities of the system (Wiggins, 2017). The GASP system currently operates in two dimensions (using circular, rather than spherical, harmonics) and uses panners of up to 3rd order (although we have software and panners written up to 35th order). With increasing Ambisonic order, the accuracy of the spatial reproduction improves, but more loudspeakers and more channels per track are required. 3rd order requires a minimum of 8 speakers arranged in an octagon and needs 16 channels per track for full 3D encoding (7 channels are all that is necessary for 2D reproduction at 3rd order, but the full 16 channels are used in the system to allow for future expansion to 3D). Currently, the GASP system has the following playback modes:

  • Minimum of 8 speakers arranged in an octagon (for 3rd order playback)
  • Minimum of 6 speakers arranged in a hexagon (for 2nd order playback)
  • Minimum of 4 speakers arranged in a square (for 1st order playback)
  • 5.1 or 7.1 decodes (3rd order using irregular optimised decoders [Wiggins, 2007])
  • Two channel stereo mix-down using the UHJ format (Gerzon, 1985)
  • Binaural output (with or without head-tracking, up to 3rd order resolution)
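The channel counts quoted above follow directly from the harmonic decomposition: a full-sphere (3D) representation at order n needs (n + 1)² channels, while a horizontal-only (2D) representation needs 2n + 1. A minimal sketch of this arithmetic (standard Ambisonics theory, not code from the GASP system):

```python
def channels_3d(order: int) -> int:
    """Full-sphere (3D) Ambisonic channel count: (n + 1)^2 spherical harmonics."""
    return (order + 1) ** 2

def channels_2d(order: int) -> int:
    """Horizontal-only (2D) channel count: 2n + 1 circular harmonics."""
    return 2 * order + 1
```

At 3rd order this gives the 16 channels carried per track and the 7 channels actually required for 2D reproduction.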

Ambisonics differs from many other formats in that the encoding of the sound field is separated from its decoding, or presentation, to the listener. To encode an audio source in a particular direction, the audio signal is multiplied by the coefficient of each spherical harmonic channel at the required angle (although the panner's graphical user interface shows a two-dimensional plane, the plug-in does implement panning in three dimensions, allowing for future expansion). The GASP panner pre-dates the standardisation of Ambisonics for virtual reality, which uses the ambiX channel ordering and normalisation standard (Nachbar et al., 2011), and so uses the Furse-Malham normalisation and channel ordering scheme (Malham, 2003). Conversion to the newer ambiX scheme is simply a change of channel order and gain, and can be realised using free plug-ins (Wiggins, 2016).
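For sources in the horizontal plane, the Furse-Malham encoding reduces to simple sinusoids of the azimuth: W carries the source with a −3 dB weighting, and each order m contributes cos(mθ) and sin(mθ) components. A per-sample sketch using the standard FuMa horizontal coefficients (illustrative, not the WigWare panner's actual code):

```python
import math

def fuma_encode_2d(sample: float, theta: float) -> dict:
    """Encode one mono sample at azimuth theta (radians, anticlockwise from
    the front) into the horizontal Furse-Malham channels up to 3rd order.
    Only the channels with non-trivial gain for z = 0 sources are produced."""
    return {
        "W": sample / math.sqrt(2.0),       # omnidirectional, -3 dB weighting
        "X": sample * math.cos(theta),      # 1st order
        "Y": sample * math.sin(theta),
        "U": sample * math.cos(2 * theta),  # 2nd order
        "V": sample * math.sin(2 * theta),
        "P": sample * math.cos(3 * theta),  # 3rd order
        "Q": sample * math.sin(3 * theta),
    }
```

For example, a source dead ahead (θ = 0) places the full signal in X, U and P, nothing in Y, V and Q, and the −3 dB weighted signal in W.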

The signal processing elements of the system are hosted in the Reaper DAW (Reaper.fm, 2019) running on a Mac Pro. Reaper is used to manage the routing and sequencing of the audio due to its flexibility in terms of channel count and routing possibilities, and its support for multiple plug-in standards (Wiggins, 2008). Each track requires 16 channels for the 3rd order Ambisonic signals created by the spatialising panner, as shown in Figure 4.

Figure 4 – A single string spatialiser with ‘stereo’ image shown as red and blue dots

3.3. Timbralisation

The intensive real-time timbral processing and spatialisation for each individual string is carried out on the Mac Pro; the sound processing (timbralising) software is Line 6 Helix Native (Line 6, 2018). Each string is processed individually by a Helix Native preset, which simulates a typical guitar rig to provide amp simulation and effects, creating the core timbre of the sound. In the early stages of the GASP project, Native Instruments Guitar Rig (Native-instruments.com, 2019) was used, but it was later replaced by Line 6 Helix Native, which appears to be more computationally efficient and offers a greater variety and quality of effects. An instance of a Helix preset on each string produces a two-channel stereo output, as many patches have some form of stereo effect on them, such as reverb, delay or chorus. It is worth noting that when applying amp distortion, processing each string individually, as opposed to processing the combined output, generates a more transparent timbre. This is because each individual string benefits from the full bandwidth and dynamic range of the amp simulation processor. Any Helix preset can be used in the GASP system, although 28 presets have been identified and stored which provide sounds for a wide range of performance genres. These are currently categorised into five sets; the presets are selectable from the timbral presets clip bank within Ableton:

GASP Timbral Presets Bank:

  • Clean/Chorus x 4
  • Amp Crunch x 4
  • Distortion x 4
  • FX x 5
  • Extreme FX x 11

The stereo signals from Helix Native are sent to WigWare GASP Panners, which position each stereo pair in 2D space using 1st, 2nd or 3rd order Ambisonic encoding. The variable parameters for the panners are Spread, Angle and Distance. Spread controls the angle between the two channels of the stereo Helix output signal, so they can be positioned at the same point in space, directly opposite each other, or anywhere in between. Angle controls the lateral angle of the centre of the stereo image, defined anticlockwise from the front. The Distance parameter is slightly mislabelled and is more accurately related to the directivity index of the resulting decoded signal. At the maximum ‘distance’, the polar pattern on replay is as directional as it can be (limited by the Ambisonic order used). As the distance is reduced, the directivity becomes wider, spreading the sound across more of the loudspeakers (essentially lowering the order continuously from 3rd to 0th order, allowing for fractional orders in between). At a distance of zero, the sound comes out of all loudspeakers equally. Snapshots of the reproduced directional response of a single sound source can be seen in Figure 5. This variable directivity control is well suited to a music production context, whereas true distance encoding, which is possible in Ambisonics using distance filtering and other psychoacoustic cues (Wiggins, 2009), would make mixing and production more difficult, as the level, frequency response and other factors would change when altering this parameter. The encoding equations used to enable this feature can be seen in Table 1. The standard equations (Furse, 2014) are shown in column 3 and the altered equations used within GASP to enable directivity panning are shown in column 4.
The standard encoding equations are based on simulating a plane wave and have been altered to provide a distance control that affects the directivity of the audio source, where each signal’s order-n components are scaled by the distance raised to the order (dⁿ). As we are currently panning only in the horizontal plane, z will always be 0, and x and y will be the coordinates of the source position on the panner’s graphical user interface (between -1 and 1). The full set of 3D 3rd order equations is shown for completeness.
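The directivity behaviour described above can be sketched by scaling each order-m component by d^m, so that d = 1 reproduces the plane-wave encoding and d = 0 leaves only the omnidirectional W channel. This is a reconstruction from the description of Table 1, not the WigWare plug-in source:

```python
import math

def fuma_encode_2d_directivity(sample: float, theta: float, d: float) -> dict:
    """Horizontal FuMa encoding with a GASP-style 'distance' (directivity)
    control: each order-m component is scaled by d**m. At d = 1 this is the
    standard plane-wave encoding; at d = 0 only W (order 0) remains, so the
    sound emerges from all loudspeakers equally on decode."""
    enc = {"W": sample / math.sqrt(2.0)}  # order 0 is never scaled
    for m, (cos_ch, sin_ch) in enumerate(
        [("X", "Y"), ("U", "V"), ("P", "Q")], start=1
    ):
        g = d ** m
        enc[cos_ch] = sample * g * math.cos(m * theta)
        enc[sin_ch] = sample * g * math.sin(m * theta)
    return enc
```

Because higher orders decay faster as d shrinks, intermediate values of d behave like the fractional orders mentioned above.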


Figure 5 – Decoded Directivity Pattern vs Distance Control

Table 1 – Ambisonic Encoding Equations with and without ‘distance’ parameter

When horizontal panning/spatialisation is used, only the channels W, X, Y, U, V, P and Q are needed (when the z coordinate is 0, the other channels exhibit zero or fixed gain). The gain versus angle for these channels is shown in Figure 6. The Ambisonic decoder creates a directional response for each loudspeaker position using a linear combination of these signals.


Figure 6 – Gain versus angle plots for 2D Ambisonic encoding equations

In order to easily select various dynamic variations of the Spread, Angle and Distance spatialisers, several spatial preset clips have been created in Ableton:

Pseudo Mono – All Strings together with minimal spread.

  • Static front
  • Rotating Clockwise
  • Stepping motion Anticlockwise
  • Stepping motion Anticlockwise, Wider Spread

Circle – All strings spaced approximately equally around the circle.

  • Static, Narrow spread
  • Static, Wide spread
  • Rotating Clockwise, Narrow Spread
  • Rotating Clockwise, Wide Spread
  • Rotating Clockwise, Spread oscillating
  • Rotating Clockwise, Spread oscillating, Distance oscillating
  • Strings 1,3,5 rotating clockwise, Strings 2,4,6 rotating anticlockwise, Narrow Spread
  • Strings 1,3,5 rotating clockwise, Strings 2,4,6 rotating anticlockwise, Wide Spread
  • 1,3,5 clockwise, 2,4,6 anticlockwise, Spread oscillating
  • 1,3,5 clockwise, 2,4,6 anticlockwise, Spread oscillating, Distance oscillating
  • Stepping clockwise, Wide Spread

Ping Pong – Strings crossing between opposite sides.

  • Instant Switch, All Strings Together
  • Instant Switch, String 1,3,5 opposite 2,4,6
  • Smooth motion over Strings 1,3,5 opposite 2,4,6
  • Smooth motion over 1 opposite 2, 3 opposite 4, 5 opposite 6, each pair offset by 45°

Figure 7 – Screenshot showing static spatialisation of all 6 strings. In performance mode, dynamic modulation of spatial parameters can be applied.

Each spatial preset can have its own speed of rotation, or speed of location switching, which is mapped to a continuous controller on a rocker pedal on the Behringer FCB1010 foot controller.

3.4. Arpeggiation

A further feature of the system is a set of individually MIDI-triggered audio gates which facilitate programmable rhythmic muting and unmuting of individual strings. This has been named the ‘GuitArpeggiator’, as it produces an effect reminiscent of the arpeggiators found on many synthesisers.

Figure 8 – ‘GuitArpeggiator’ screenshot showing the MIDI note (G2 – C3) on/off switching of six TrackGates, each of which is assigned to an individual string.

The audio gates are triggered by a MIDI note sequence from an Ableton clip; a MIDI note filter on each channel is set up to pass only the single note that relates to that string. This filtered note signal is sent to an instance of DMG TrackGate (Dmgaudio.com, 2019), which is set to receive a MIDI note on/off sidechain associated with each individual string, as shown in Figure 8. Each string’s gate attack and release times can be adjusted; it should be noted that optimal settings are required to avoid harsh switching and audible glitching. The MIDI note-on message triggers the gate’s attack; the gate then stays open until the note-off message triggers the release phase. The arpeggiation effect can also be blended with the output from the other mono guitar pickups, such that a sustained strummed chord can be output from the mono pickup output whilst the MIDI sequence picks out and enhances the selected programmed rhythmic notes.
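The note-on/attack and note-off/release behaviour described above can be sketched as a simple per-string gain envelope; the short but non-zero ramps are what avoid the harsh switching and audible glitching noted. This is illustrative only, and the attack/release values are assumptions rather than DMG TrackGate's actual envelope:

```python
def gate_gain(t: float, note_on: float, note_off: float,
              attack: float = 0.005, release: float = 0.020) -> float:
    """Gain of a MIDI-triggered audio gate at time t (all times in seconds).
    A linear ramp up over `attack` follows the note-on, the gate holds open
    until note-off, then ramps down over `release`."""
    if t < note_on:
        return 0.0
    if t < note_on + attack:
        return (t - note_on) / attack           # attack ramp
    if t < note_off:
        return 1.0                              # gate held open
    if t < note_off + release:
        return 1.0 - (t - note_off) / release   # release ramp
    return 0.0
```

Multiplying each string's audio by this envelope, driven by that string's filtered MIDI note, yields the rhythmic muting/unmuting pattern.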

3.5. Ambisonic Send Effects

Two Ambisonic effects (i.e. effects that process Ambisonic B-format signals directly) are set up on bus sends that receive the signals after the spatialisation process has taken place; these are WigWare AmbiFreeVerb 2 and Blue Ripple O3A Spatial Delay.

AmbiFreeVerb 2 (Wiggins, 2016) provides a full 3D spatial reverb operating at 1st order. The reverb processing is novel in that working in the spatial domain allows the reverb to be applied to the entire sound field while processing different areas of the sound field separately, demonstrating good spatial and immersive properties that react to the spatial panning and information contained within the sound field. Although the reverb works at 1st order resolution, the 3rd order direct sound (supplied by the panners) provides the higher directional impression of the dry audio (Wiggins, 2016).

Blue Ripple Spatial Delay provides a delay effect for the whole 2D sound field, with a rotation effect in the feedback loop so that every repeat from the delay can arrive from a different angle. This has a particularly interesting effect in conjunction with some of the GASP spatialisation patterns (Blue Ripple, 2015).

3.6. System Control

For live use, a control system is required which can be used to select presets in real time, either by the performing guitarist with a foot controller or by a ‘live producer’ working in conjunction with the performer. Given the large number of parameters in each signal path, multiplied by seven (one path for each isolated string and one for the mono output of the middle pickups), it is clear some form of intermediate macro control system is required. In order to unify the timbral, spatial and other parameters, a second computer (iMac) running Ableton Live (Ableton.com, 2019) has been programmed to function as the macro controller. Ableton clips, as shown in Figure 9, are implemented to facilitate the three core elements of GASP processing, i.e. spatialiser, timbraliser and arpeggiation presets, collectively referred to as GASP Auditory Scenes.

Figure 9 – Ableton screenshot with Spatial, Timbral, and Arpeggiation preset clips

The clip structures in Ableton create looping sequences of MIDI continuous controller messages; the loops effectively function as low frequency oscillators controlling the parameters of the spatialisers. Additionally, there is a bank of clips containing program change messages to enable timbre selection and arpeggiation presets.
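A looping clip of continuous controller values behaves exactly like a sampled LFO. As an illustration, a one-bar ramp that sweeps a spatialiser's Angle parameter through a full rotation could be generated as follows (a sketch of the clip idea only; the actual GASP clips and their CC assignments are authored directly in Ableton):

```python
def rotation_cc_clip(steps: int = 64) -> list:
    """One bar of MIDI CC values (0-127) ramping from 0 to 127. Mapped to a
    spatialiser's Angle parameter and looped in Ableton, this acts as a
    sawtooth LFO producing continuous rotation; the loop tempo sets the
    rotation speed."""
    return [round(i * 127 / (steps - 1)) for i in range(steps)]
```

Other preset shapes (oscillating Spread, stepping motion) are simply different value sequences in the same clip format.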

For the system to be controlled by a performing guitarist, a Behringer FCB1010 MIDI pedalboard is employed (Behringer.com, 2019); this allows the performer to independently select the timbre, spatial and arpeggiation preset clips stored in Ableton. The FCB1010 is configured to allow the guitarist both to select Ableton clips and to operate the continuous controller pedals to enable variations in spatialisation. Each pedalboard footswitch is set up to send a unique MIDI message which can then be assigned to trigger any Ableton clip, including master clips. One of the expression pedals on the FCB foot controller is assigned to control the tempo of the Ableton session, which is directly related to the spatialisation parameters. The spatialisers not only position audio at a given location; they are controllable via the tempo of Ableton’s sequencer, which is in turn mapped to the continuous controller pedals of the FCB foot controller, allowing real-time user control. As the clip length is set in bars, this speeds up or slows down the rate of the spatialiser, thus facilitating interesting spatial rotation and/or fast locational switching effects on individual strings. It is worth noting that the FCB1010 is notoriously challenging to program, so the UNO/FCB firmware upgrade (Control Fcb1010.uno, 2019) has been installed, enabling improvements in this regard.

3.7. Spatial Effects And Control

Spatial effects are achieved through real-time control of the Spread, Angle, Distance (SAD) spatialisers. The movement of the SAD spatialisers is currently mapped, all parameters simultaneously, directly to Ableton’s tempo, such that as the tempo is increased, the values of all SAD spatialiser parameters also increase pro rata. The tempo control is mapped to the FCB foot controller, allowing the performer to vary the SAD spatialisation values using the FCB’s continuous controller (CC) rocker pedal. Ableton’s available tempo values range from 20 to 999 bpm, mapped linearly to CC pedal values 0 to 127. Given this configuration, some investigations into the performative aspects of real-time spatial manipulation have been carried out. Whilst the current linear mapping generates some interesting spatial effects, it also has limitations for real-time live performance control. It is noted that around the mid-range of the tempo values there are some particularly interesting spatial effects, which exist within a narrow band of tempo values. However, this narrow band is not well suited to the CC foot controller, as a finer resolution of rocker pedal movement is required. Work currently being undertaken should enable this to be investigated further: a linear-to-non-linear look-up table for continuous controller values has been created, which will effectively expand the mid-range SAD values. This should allow a more nuanced investigation of the spatial effects at these mid-range tempo values. Example non-linear mappings are shown in Figure 10.
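One plausible shape for such a look-up table is a cubic that flattens around the centre of the pedal travel, so that more CC values (and therefore finer pedal resolution) are devoted to the mid-range tempos where the interesting spatial effects were observed. The curve below is an assumption for illustration, not the actual table used in the system:

```python
def cc_to_tempo_linear(cc: int) -> float:
    """Current mapping: CC 0-127 mapped linearly onto 20-999 bpm."""
    return 20.0 + (cc / 127.0) * (999.0 - 20.0)

def cc_to_tempo_expanded_mid(cc: int) -> float:
    """Hypothetical non-linear look-up curve: a cubic whose slope is near
    zero at the middle of the pedal travel, expanding the mid-range tempo
    band across many more CC values while keeping the same end points."""
    x = cc / 127.0
    y = ((2.0 * x - 1.0) ** 3 + 1.0) / 2.0  # flat around x = 0.5
    return 20.0 + y * (999.0 - 20.0)
```

With this curve, a pedal movement around mid-travel changes the tempo far less than the same movement does under the linear mapping, giving the finer resolution described above.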

Figure 10 – Examples of transfer function curves for CC modulation of SAD parameters (future work), Courtesy S Thackery 2019

All SAD values are currently affected simultaneously by Ableton’s tempo setting, such that independent control of the individual Spread, Angle or Distance parameter movements with CC foot controller pedals is unavailable. This provides the next challenge for future work: to investigate ways of facilitating individual and independent control of the SAD parameters. This would also enable additional features, such as LFO and EG controllers for each SAD parameter, to become programmable or real-time performative features.

4. Reflections on GASP Post-Productions

GASP productions of Elliot’s Joy, Prelude to Life (Baker, 2014) and Pale Aura were first presented at the ‘Klingt Gut’ international conference, Technical University of Hamburg, Germany, in June 2017. Cat Fantastic was first presented at the ‘Sounds in Space’ symposium, University of Derby, UK, in June 2019. Examples of productions, using YouTube spatial audio (optimised for headphone listening using 1st, rather than 3rd, order Ambisonics), can be found on the GASP YouTube channel https://tinyurl.com/GASPYouTube (Werner and Wiggins, 2019).

Some comments received from interested parties include:

Guitarist Roman Rappak (Miroshot, 2019), upon hearing of the project, commented:

‘…it’s exciting to think of the kind of creative options this would open to guitarists, and as a live band that is focused so heavily on tech and the future, this could be exactly the kind of sound we are looking for…’

Music Producer David Ward, Executive Director of JAMES (Joint Audio Media Educational Support) (Jamesonline.org.uk, 2019), commented:

‘…of particular interest, on our [accreditation] visit, was the Guitar based Ambisonic Spatial Performance project where we became aware of the myriad commercial, theatrical, performance and educational potentials of this project.’

Guitarist and Educator Fred T. Baker (En.wikipedia.org, 2019) commented:

‘The first time hearing my tracks back I was amazed by the quality of the spatial sound, it really had the WOW factor. I feel it has much potential for future development, it would be fantastic in a larger theatre. I think this system is unique, it gave my composition a whole new dimension.’

Elliot’s Joy: composed and performed by Fred T. Baker; production by Jack Hooley and Dominic Dallali. This was the first of a series of GASP productions; the recording was made with our Yamaha APX400 electro-acoustic fitted with an Ubertar hex pickup and recorded directly into Pro Tools, with timbralisation subsequently applied using NI Guitar Rig and spatialisation production completed in Reaper using WigWare. Upon reflection, the ‘phasey’ guitar timbre is probably a little overdone, and there are some sections of the mix (e.g. the bridge part) which ‘pop out’, level-wise, a little too much. The lower strings work well as the bass part, although there is little spatial movement in the low frequency content.

Prelude to Life: composed and performed by Fred T. Baker; production by Charlie Box and Duncan Werner. The recording was made with our Yamaha APX400 electro-acoustic, recorded directly into Pro Tools. The timbralisation is a mix of both NI Guitar Rig and instrument samples. Melodyne’s pitch-to-MIDI conversion (Celemony.com, 2019) was applied, and the MIDI note events were then arranged to trigger various instrument samples in NI Komplete (Native-instruments.com, 2019). Upon reflection, the timing of the pitch-to-MIDI conversion worked very well, such that the nuances of the guitar performance are precisely captured. The production was experimental and, again upon reflection, the choice of instrumentation could be refined, perhaps with a less diverse range of timbres, which may better suit the performance genre.

Pale Aura: performed by Dominic Dallali; production by Jack Hooley and Dominic Dallali. The recording was made with our Fender Stratocaster fitted with Ubertar hex pickups, recorded directly into Pro Tools, then timbralised with NI Guitar Rig. The spatialisation production was completed in Reaper using WigWare. This track is the guitar part of Pale Aura by the band Periphery; it is in the progressive metal genre. It turned out to be quite a dramatic production, with rapid changes in location for close temporal events. The guitar part consists of some highly syncopated timing elements, which have been mapped to rapid location switching. There is a good range of amp distortion timbres employed for different parts of the performance. An unfortunate hiss is noticeable during both the intro and outro, which would also benefit from better creative spatialisation. A low kick drum was included to provide the listener with a sense of meter, as the guitar performance includes several syncopated elements; upon reflection, the kick is an unwelcome distraction from the performance in places.

Cat Fantastic (CF): performed by Jack Hooley; production by Duncan Werner and Emma Fitzmaurice. The initial recording was made using our Fender Stratocaster fitted with Ubertar hex pickups. Two very different mixes of the track were made, both using Helix Native for timbralisation.

CF Mix 1 has a fixed timbre with dynamic spatialisation. Using the same timbre throughout, several spatial presets were applied during different sections of the arrangement; this enabled critical listening to the variations in the spatial production of the mix. However, upon reflection, a greater sense of spatial modulation could have been achieved by implementing tempo (speed) variations in the SAD spatialisation.

CF Mix 2 applies post-production timbral morphing and dynamic spatialisation. Initial ideas for timbral morphing included some investigation into real-time continuous controller messages mapped to timbral parameters, e.g. chorus modulation values or reverberation. However, this proved problematic, as the number of continuous controllers requiring real-time modulation was far greater than Helix allows access to. The ‘Pmix plugin preset interpolator’ (Olilarkin.co.uk, 2019) was investigated; however, it supports VST2 only, whereas Helix works with the VST3, AAX and AU formats. Nevertheless, real-time timbral morphing remains an area for future investigation. In order to demonstrate proof of concept, multiple versions of each string, each with a different timbre, were printed on time-synchronised parallel tracks, allowing crossfading between individual string timbres before mixing the respective tracks. The timbral morphing works well, although future mixes using this technique would benefit from longer morphing durations.

A general observation relating to GASP post-production is that the musical nature of differing guitar performance styles (for example, acoustic picking versus thrash metal) can benefit from potentially extreme variations in timbral and spatial production: the more complex the performance technique, the greater the creative potential for the GASP production.

5. Critical Analysis and Future Work

5.1. System Rationalisation

Techniques for system rationalisation are sought, which will mean a rethink of the current configuration; considerations include combining the sound processing (Reaper, Helix and WigWare, currently on the Mac Pro) and the system control (Ableton, currently on the iMac) into one application running on a single computer. The most demanding processing is Helix, so any rationalisation would look here first.

5.2. Expansion to 3D

The current system operates in a 2D immersive environment; however, future work will look towards GASP productions which include height information for a deeper immersive experience. This raises the questions: how might our current 2D guitar production techniques be expanded to 3D, and does this enhance the aesthetic immersive experience of a guitar performance beyond a 2D experience? Further questions then arise, such as how the stylistic performance technique of a specific guitar genre maps into 3D.

5.3. Live Performance Control

The current spatial parameters of Spread, Angle and Distance lead us to consider enhancement possibilities for interface and control. Pysiewicz and Weinzierl (2016) suggest a taxonomy of spatialisation with three categorisations: controller type/interface, control of spatial parameters, and scope of the control. Their paper informs consideration of further creative options which may be applied to enhance GASP. One way we have considered controlling the GASP system in a performance environment is through a dedicated ‘live GASP producer’, who would provide control over timbral, spatial and other performance realisation parameters independently of the guitarist. This leads to the idea of an interactive sound arts performance in which the guitarist responds to changes initiated by the live producer, the outcome being potentially unpredictable, depending on the imaginative response of the performing guitarist. Further, the live production control parameters could be made available to an audience, who may also contribute to the performative outcome, resulting in an audience/guitarist interactive performance.

The live production work to date has focussed on spatial manipulation of individual strings, each with the same timbral processing. Future work will investigate Reaper Project templates that include individual, differing timbres across different strings or, probably more usefully, across groups of strings, e.g. strings 1, 2 and 3 with timbre A, strings 4, 5 and 6 with timbre B. This technique can then be expanded to include different spatial presets on individual strings, or on other alternative groupings of strings.
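Such a template could be represented as a simple lookup table. The sketch below assumes the hypothetical two-group arrangement described above; the timbre and spatial preset names are placeholders, not actual GASP patches.

```python
# Hypothetical preset table: groups of strings sharing a timbre and a
# spatial preset, as envisaged for future Reaper Project templates.
STRING_GROUPS = [
    {"strings": (1, 2, 3), "timbre": "A", "spatial": "rotating"},
    {"strings": (4, 5, 6), "timbre": "B", "spatial": "static-wide"},
]

def presets_for_string(n):
    """Look up the timbre/spatial presets assigned to string n (1-6)."""
    for group in STRING_GROUPS:
        if n in group["strings"]:
            return group["timbre"], group["spatial"]
    raise ValueError(f"string {n} not assigned to a group")
```

Regrouping the strings, or attaching a different spatial preset per group, then reduces to editing this table rather than rebuilding the routing.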

Additionally, real-time timbral morphing remains a key area for future work. Investigations into accessing multiple parameters of timbral processing via CC messages continue to take place with both Helix and Guitar Rig.
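At the byte level, timbral morphing over CC is a stream of MIDI Control Change messages. The sketch below builds raw three-byte CC messages; the controller number 21 is purely an assumption for illustration, since the actual CC assignments exposed by Helix or Guitar Rig depend on their MIDI-learn configuration.

```python
def cc_message(channel, controller, value):
    """Build a raw 3-byte MIDI Control Change message.
    channel 0-15; controller and value 0-127."""
    if not (0 <= channel < 16 and 0 <= controller < 128 and 0 <= value < 128):
        raise ValueError("MIDI CC field out of range")
    # Status byte 0xB0 ORed with the channel, then controller and value.
    return bytes([0xB0 | channel, controller, value])

# Hypothetical morph: sweep an amp-model parameter (assumed to be mapped
# to CC 21) from minimum towards maximum in 8 steps.
morph = [cc_message(0, 21, v) for v in range(0, 128, 16)]
```

Sending such a ramp of messages at a chosen rate is one way a morph between two timbral states could be automated or assigned to a producer’s fader.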

5.4. Frequency Band Splitting

An alternative approach, without the use of hex pickups, is to take the standard mono guitar output and divide it into a range of frequency bands, which can then be processed independently, timbrally, spatially or both. To achieve this, a six-way crossover (frequency splitter) network was implemented on the main mono guitar input. As the system configuration is already optimised for six channels of guitar, the six frequency bands can be mapped to these channels. The output of each band of the crossover is then fed to the same processing channel setup as is used for the individual string processing system, which allows all the same patches and the same control system to be used. The crossover points are chosen to have equal octave spacing and to lie within the range of frequencies produced by an electric guitar. Whilst this approach has been tested, we have not yet examined its spatial effects and possibilities in as much detail as individual string processing.
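The equal-octave crossover points can be derived directly: splitting the guitar’s usable range into six equal-octave bands places the five crossover frequencies at successive octave steps above the lower limit. The sketch below assumes an illustrative 80 Hz to 5120 Hz range (exactly six octaves); these limits are an assumption for the example, not the exact values used in the GASP crossover.

```python
import math

def octave_crossovers(f_low=80.0, f_high=5120.0, n_bands=6):
    """Return the n_bands-1 crossover frequencies that split the range
    [f_low, f_high] into n_bands bands of equal octave width."""
    total_octaves = math.log2(f_high / f_low)
    octaves_per_band = total_octaves / n_bands
    return [f_low * 2 ** (octaves_per_band * k) for k in range(1, n_bands)]
```

With the assumed limits this yields crossovers at 160, 320, 640, 1280 and 2560 Hz, i.e. six one-octave bands that map directly onto the six existing guitar processing channels.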

6. Acknowledgements

Since the inception of the GASP project, there have been both technical and musical contributions from staff and students at the University of Derby; our thanks to: Charlie Middlicott, Sam Speakman, Mark Randell, Alex Wardle, Joe Callister, Tom Lawson, Tom Weightman, Dominic Dallali, Jack Hooley, Charlie Box, Emma Fitzmaurice, Beth Mansfield, Thomas Nash, Harry Dale, Steve Thackery, Emiliano Bonanomi (session guitarist), and others.

7. References

Ableton.com. (2019). Music production with Live and Push | Ableton. [online] Available at: https://www.ableton.com/en/ [Accessed 26 Nov. 2019].

Avid.com. (2019). Pro Tools – Music Software – Avid. [online] Available at: https://www.avid.com/pro-tools [Accessed 26 Nov. 2019].

Baker, F. (2018). Comment on the GASP System. [email].

Bates, E., (2009). The Composition and Performance of Spatial Music (Doctoral dissertation, Trinity College Dublin).

Bates, E., Furlong, D. and Dennehy, D. (2008). Adapting Polyphonic Pickup Technology for Spatial Music Performance. In Proceedings of the International Computer Music Conference (ICMC).

Behringer.com. (2019). |FCB1010|Behringer|P0089. [online] Available at: https://www.behringer.com/Categories/Behringer/Accessories/Midi-Foot Controllers/FCB1010/p/P0089 [Accessed 26 Nov. 2019].

Blue Ripple Sound. (2019). O3A Core. [online] Available at: http://www.blueripplesound.com/products/o3a-core [Accessed 26 Nov. 2019].

Celemony.com. (2019). Celemony. [online] Available at: https://www.celemony.com/en/start [Accessed 26 Nov. 2019].

Cycfi Research. (2019). Cycfi Research. [online] Available at: https://www.cycfi.com/ [Accessed 26 Nov. 2019].

Malham, D. (2003). Higher Order Ambisonic Systems. Abstracted from ‘Space in Music – Music in Space’, MPhil thesis, University of York. [online] Available at: http://www.york.ac.uk/inst/mustech/3d_audio/higher_order_ambisonics.pdf

Dmgaudio.com. (2019). DMG Audio: Products : TrackGate. [online] Available at: https://dmgaudio.com/products_trackgate.php [Accessed 26 Nov. 2019].

En.wikipedia.org. (2019). Fred Thelonious Baker. [online] Available at: https://en.wikipedia.org/wiki/Fred_Thelonious_Baker [Accessed 28 Nov. 2019].

Facebook.com. (2019). klingt gut. [online] Available at: https://www.facebook.com/klingtGutHamburg [Accessed 26 Nov. 2019].

Fcb1010.uno. (2019). FCB/UnO Control Center for the Behringer FCB1010. [online] Available at: https://www.fcb1010.uno/ [Accessed 26 Nov. 2019].

Fender.com. (2020). Pau Ferro Guitars | Fender Guitars. [online] Available at: https://www.fender.com/articles/tech-talk/what-is-pau-ferro [Accessed 8 Mar. 2020].

Fender (2020). CITES Regulations For The Importation And Exportation Of Rosewood Effective January 2, 2017. [online] Available at: https://support.fender.com/hc/en-us/articles/115000867426-CITES-Regulations-For-The-Importation-And-Exportation-Of-Rosewood-Effective-January-2-2017 [Accessed 8 Mar. 2020].

Furse, R. (2014). HOA Technical Notes – B-Format. [online] Blue Ripple Sound. Available at: https://web.archive.org/web/20141020063947/http://www.blueripplesound.com/b-format [Accessed 28 Nov. 2019].

Gerzon, M. A. (1974a) Sound Reproduction Systems. Patent No. 1494751

Gerzon, M. A. (1974b) What’s wrong with Quadraphonics. Available at: http://audiosignal.co.uk/Resources/What_is_wrong_with_quadraphonics_A4.pdf [Accessed 26 May 2016]

Gerzon, M. A. (1985). Ambisonics in Multichannel Broadcasting and Video. Journal of the Audio Engineering Society, 33(11), pp. 859–871.

Graham, R. and Harding, J. (2015). SEPTAR: Audio Breakout Circuit for Multichannel Guitar. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Louisiana, USA, pp. 1-4.

Graham, R., Bridges, B., Manzione, C. and Brent, W. (2017). Exploring Pitch and Timbre through 3D Spaces: Embodied Models in Virtual Reality as a Basis for Performance Systems Design. In NIME (pp. 157-162).

Jamesonline.org.uk. (2019). JAMES. [online] Available at: http://www.jamesonline.org.uk/ [Accessed 28 Nov. 2019].

Line 6 (2018). Helix Native. [Online] Available at: https://uk.line6.com/helix/helixnative.html

Miroshot. (2019). Miro Shot – AR/VR band/collective. [online] Available at: https://miroshot.com/ [Accessed 28 Nov. 2019].

Nachbar, C., Zotter, F., Deleflie, E. and Sontacchi, A. (2011). AmbiX – A Suggested Ambisonics Format. In Ambisonics Symposium, Lexington.

Native-instruments.com. (2019). GUITAR RIG 5 PRO. [online] Available at: https://www.native-instruments.com/en/products/komplete/guitar/guitar-rig-5-pro/ [Accessed 27 Nov. 2019].

Native-instruments.com. (2019). KOMPLETE 12. [online] Available at: https://www.native-instruments.com/en/products/komplete/bundles/komplete-12/ [Accessed 27 Nov. 2019].

Olilarkin.co.uk. (2019). audio software by oli larkin. [online] Available at: https://olilarkin.co.uk/index.php?p=pmix [Accessed 26 Nov. 2019].

Perez-Lopez, A. (2015). 3Dj: A SuperCollider Framework for Real-Time Sound Spatialization. Georgia Institute of Technology.

Pysiewicz, A. and Weinzierl, S., (2017). Instruments for Spatial Sound Control in Real Time Music Performances. A Review. In Musical Instruments in the 21st Century (pp. 273-296). Springer, Singapore.

Randell, M. (2011). Report on a technological and aesthetic exploration of a multichannel guitar system. [online] Academia.edu. Available at: https://www.academia.edu/11735503/Report_on_a_technological_and_aesthetic_exploration_of_a_multichannel_guitar_system [Accessed 2 Dec. 2019].

Rappak, R. (2018). Audience of the Future : 3D Guitar Production System. [email].

Reaper.fm. (2019). REAPER | Audio Production Without Limits. [online] Available at: https://www.reaper.fm/index.php [Accessed 26 Nov. 2019].

Rubenstein, P. (2019). Ubertar Hexaphonic Guitar Pickups. [online] Ubertar.com. Available at: http://www.ubertar.com/hexaphonic/ [Accessed 25 Nov. 2019].

Sounds in Space. (2019). Sounds in Space. [online] Available at: http://soundsinspace.co.uk/ [Accessed 26 Nov. 2019].

Valiquet, P., (2012). The spatialisation of stereophony: Taking positions in post-war electroacoustic music. International Review of the Aesthetics and Sociology of Music, pp.403-421.

Ward, D. (2018). James Accreditation Visit. [email]

Werner, D. and Wiggins, B. (2019). GASP. [online] YouTube. Available at: https://tinyurl.com/GASPYouTube [Accessed 2 Dec. 2019].

Wiggins, B. (2007) The generation of panning laws for irregular speaker arrays using heuristic methods. Audio Engineering Society Conference: 31st International Conference: New Directions in High Resolution Audio. Audio Engineering Society.

Wiggins, B. (2008) Reproduced Sound 24 – Proceedings of the Institute of Acoustics, Vol 30. Pt 6

Wiggins, B. (2016). YouTube, Ambisonics and VR. [online] The Blog of Bruce. Available at: https://www.brucewiggins.co.uk/?p=666#more-666 [Accessed 26 Nov. 2019].

Wiggins, B. (2017) Analysis of Binaural Cue Matching using Ambisonics to Binaural Decoding Techniques. 4th International Conference on Spatial Audio, 7-10 Sept., Graz, Austria.

Wiggins, B. and Dring, M. (2016). AmbiFreeVerb 2—Development of a 3D Ambisonic Reverb with Spatial Warping and Variable Scattering. Audio Engineering Society Conference: 2016 AES International Conference on Sound Field Control. Audio Engineering Society.

Wiggins, B., Spenceley, T. (2009) Distance coding and performance of the mark 5 and st350 SoundField microphones and their suitability for Ambisonic reproduction. Reproduced Sound 25 – Proceeding of the Institute of Acoustics, Vol 31, Pt 4.


Baker, F. T. (2014). Life Suite. [CD] First Hand Records. Available at: https://www.youtube.com/watch?v=hDJfv-Hft_s

Copyright 2012–2021 GASP™ Guitars with Ambisonic Spatial Performance