This poster presents work by Undergraduate Research students Charlie Box, Jack Hooley and Dominic Dallali, supervised by Duncan Werner and Bruce Wiggins.

Project Aim

To investigate and further enhance the technical and musically creative outcomes of the GASP project (Guitars with Ambisonic Spatial Performance).

Objectives

  • To generate high-quality multichannel guitar test recordings to enable the complete GASP production cycle to be realised, i.e. multichannel guitar signal capture, noise and crosstalk editing, arrangement by parts, identification and extraction of musical motifs, timbralisation, and spatialisation. The final sound files can then be archived as ‘B-format Ambisonic’ for multi-speaker arrays, and as ‘Binaural’ for surround-sound listening over headphones (a minimal encoding sketch follows this list).
  • To produce suitable multichannel guitar test recordings to enable ‘Pitch-to-MIDI’ data to be generated, enabling alternative electronically synthesised timbres in the ambisonic 3D sound stage whilst retaining the human performance information. This utilises much of the above process, but includes waveform conversion to MIDI note performance data, which then facilitates the production of timbral alternatives using synthetic or sampled sounds prior to surround-sound spatialisation.
  • To further develop the recently created GASP-panner Graphical User Interface, intended to facilitate live performance, and to collect user feedback on it. The feedback will be used to inform subsequent enhancements; see ‘Further work’ below.
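
As an illustration of the spatialisation stage, the sketch below applies the standard first-order (B-format) encoding equations to a single per-string signal. This is a minimal sketch using the FuMa channel convention and hypothetical signal names; it is not the Wigware implementation, which handles panning within the plug-ins themselves.

    import numpy as np

    def encode_bformat(mono, azimuth_deg, elevation_deg):
        # Standard first-order B-format (FuMa W, X, Y, Z) encoding of a mono
        # signal placed at the given azimuth (anticlockwise from the front)
        # and elevation (upwards from the horizontal plane).
        az = np.radians(azimuth_deg)
        el = np.radians(elevation_deg)
        w = mono / np.sqrt(2.0)                 # FuMa applies -3 dB to W
        x = mono * np.cos(az) * np.cos(el)
        y = mono * np.sin(az) * np.cos(el)
        z = mono * np.sin(el)
        return np.stack([w, x, y, z], axis=-1)  # shape: (samples, 4)

    # e.g. place one string 30 degrees to the left and slightly raised:
    # bformat = encode_bformat(string_signal, azimuth_deg=30, elevation_deg=10)

The resulting four-channel signal can then be decoded to a loudspeaker array or rendered binaurally for headphone listening.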

Guitars and pickups

We currently use two guitars for the GASP project, a Fender Stratocaster and a Yamaha APX acoustic; both have been retrofitted with multichannel pickups. The pickups are a bespoke design in which each guitar string has its own individual audio output, allowing each string to be processed independently (see the channel-splitting sketch below). The pickups are manufactured and supplied by ‘Ubertar Hexaphonic Pickups’.
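
For reference, a minimal sketch of how a six-channel (hexaphonic) take might be split into per-string files is shown below; the filename, the channel ordering (low E to high E) and the use of the Python soundfile library are assumptions for illustration only.

    import soundfile as sf

    # Assumed six-channel take, one channel per string, low E first.
    data, fs = sf.read("hexaphonic_take.wav")      # data shape: (samples, 6)
    strings = ["E_low", "A", "D", "G", "B", "E_high"]

    for ch, name in enumerate(strings):
        # Write each string to its own mono file for individual editing.
        sf.write(f"string_{ch + 1}_{name}.wav", data[:, ch], fs)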

Guitars with Ambisonic Spatial Performance (GASP)

GASP is an ongoing research project into guitar performance. The individual string signals generated by our multichannel guitars are given separate timbres using virtual guitar processing software and are then processed ambisonically, providing scope for alternative guitar performance and production techniques.

Outcomes

  • The recording ‘Elliot Joy’, written and performed by Fred T Baker. The initial multichannel recording was made using our Yamaha acoustic guitar fitted with multichannel pickups, enabling individual string recording and editing. Once recorded, the production process required critical listening to each string in order to edit and replace ‘scuffed’ notes, remove crosstalk on unused strings, and identify suitable waveform loop points so that a new arrangement could be created. This listening also identified several musical motifs, which were copied into new tracks with a view to enhancing the production with additional, alternative timbralisation and localisation of those motifs. New timbres were generated using Guitar Rig sound-processing software; the waveform data was then imported into Reaper for spatialisation using the bespoke UoD Wigware panner plug-ins.
  • The recording of the guitar part from the track ‘Pale Aura’ by progressive metal band ‘Periphery’. Dominic Dallali (URSS student) made this recording using our multichannel Fender Stratocaster. A click-track was used to ensure the timing stability characteristic of this genre, and a similar production technique to the above was employed. Due to the syncopated nature of the guitar performance, a simple kick drum part was included to provide a sense of metre.
  • The recording ‘Prelude to Life’, written and performed by Fred T Baker. The techniques discussed above were used; however, the emphasis was on experimenting with Melodyne’s ‘Pitch to MIDI’ conversion and subsequently timbralising with pre-recorded sound samples (a sketch of the underlying pitch-to-note mapping follows this list). Spatialisation was created in a similar way to the above, using the UoD Wigware plug-ins.
  • In April 2016 Google released spatial audio for its 360°/Virtual Reality video platform on YouTube. This allows 360° immersive videos to carry 3D audio that tracks the user’s head movements over headphones (Android only at present), giving a more immersive and realistic auditory experience. The format chosen by Google is a subset of the Higher Order Ambisonics system used by the Wigware software plug-ins, allowing us to create a mix compatible with YouTube’s binaural, head-tracked headphone output. For further technical details and a demonstration of the technology behind YouTube Spatial Audio, please visit http://tinyurl.com/WigYouTubeTeardown.
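
The ‘Pitch to MIDI’ conversion itself is performed in Melodyne; the sketch below only illustrates the underlying mapping from a detected fundamental frequency to the nearest MIDI note number, with note timing and dynamics carried over separately from the performance.

    import math

    def freq_to_midi(freq_hz, a4_hz=440.0):
        # Nearest equal-tempered MIDI note number for a detected fundamental.
        return int(round(69 + 12 * math.log2(freq_hz / a4_hz)))

    print(freq_to_midi(196.0))   # open G string (~196 Hz) -> MIDI note 55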

Further work

  • To investigate the potential for guitar performance training using the multichannel guitar system. The detail provided by individual string waveforms is useful for research into the analysis of performance timing, and for identifying so-called ‘scuffed’ notes, which might be unintentional mutes, fret buzz or other performance artefacts. From an educational/training perspective this carries value for guitar tutors and learners (a timing-analysis sketch follows this list).
  • To investigate the implementation of a Behringer control surface, which will enable real-time spatial mix control using hardware faders (as opposed to on-screen controls). This will provide real-time, simultaneous control over several spatial location parameters.
  • To investigate real-time guitar performance with GASP; our current system may need upgrading to provide enough processing power to run 12 instances of Guitar Rig simultaneously (two Guitar Rig timbres per string).
  • To investigate the use of Ableton Live, utilising clip/scene mode with pre-recorded arrangement parts for individual strings, providing the user with a creative interface for live performance of pre-recorded GASP material.
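
As a sketch of how per-string timing analysis might be approached (this is not part of the current GASP toolchain), onset times can be detected for each string file and compared against a click track; the filenames and the use of the librosa library are assumptions for illustration.

    import librosa

    # Hypothetical per-string exports from the multichannel session.
    for i in range(1, 7):
        y, sr = librosa.load(f"string_{i}.wav", sr=None, mono=True)
        onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")
        print(f"string_{i}: {[round(float(t), 3) for t in onsets]} s")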

Copyright 2012-2021 GASP™ Guitars with Ambisonic Spatial Performance