Electric avenue

Jazz guitarist Jonathan Crossley is working towards a PhD at Wits University in Johannesburg, where he lectures in the music department.

Doctorates can get a bit stuffy, but Crossley, though a serene, cheerful sort, has a mad scientist just below the surface.

In lay terms, the question his research is asking is: “What do you get if you cross a guitar with a robot?”

One of his practical responses is what he calls Technologically Enabled Performance Practice (TEPP): the musician dons a modified motion-capture suit that responds digitally to the wearer’s movements, as well as to signals from Crossley’s redesigned guitar.

What is produced is not music as we know it. Crossley nods, pulls out his phone and shows a picture of a score (yes, a picture) on the screen, using it as an aid to explain the concept. “I have material that works on a timeline,” he says. “The music that results is much more accessible than I imagined it would be. There’s physical movement involved in creating the sounds, and it’s important to make sure that these strong audio results are understandable.

UNDERSTANDING. Jonathan Crossley, left, playing with drummer Jonathan Sweetman.

“I love electronic music – I always have – but the concerts in that genre are boring. There’s one guy with a laptop and another guy with some random visuals. Live, TEPP is tactile. You see the person performing the task. I wanted to develop my guitar music, but to avoid having the electronics only as an addendum.”

Musical instruments that we take for granted – Les Paul guitars, for instance – would once have seemed like crazy, futuristic ideas. Is Crossley’s thinking too conceptual for TEPP to become common practice? “Up until the point when recording technology developed, instruments were just being refined,” he says. “Then, once recording started, the ideal was to have a great musician playing a great piece. But that stunted music to some degree, as there was little room for experimentation.

“As things developed, technology became a commodity. So Les Paul guitars: millions were made, and everyone could own one.

“When I was touring, I was buying more and more effects pedals,” he says, “but I ended up on my knees, twiddling knobs to get the sounds I wanted.”

He pats his chest: “The buttons needed to be up here.”

ARTIFICIAL INTELLIGENCE. By combining human emotion with a machine’s capabilities, Jonathan Crossley hopes to drive a whole new musical movement. Picture: www.sxc.hu.

The first step was to eviscerate one of his jazz guitars, replacing its innards with rewired hardware boards and adding dials and controls to the outside of the instrument. “That wasn’t good enough, though,” he smiles, “because I still needed to stop playing to change the settings. So I started considering movements that would allow my hands to still be touching the strings.

“I figured that extending an elbow could create a certain value, bending a wrist could create another, and so on. I started experimenting with potentiometers [variable resistors used as adjustable controls].
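
To make the idea concrete, the mapping he describes might look something like the minimal sketch below, which scales a raw potentiometer reading into the 0–127 range used by MIDI controllers. The sensor names, value ranges and effect parameters are illustrative assumptions, not details of Crossley’s actual rig.

```python
# Minimal sketch: turn a raw potentiometer reading (assumed 10-bit ADC,
# 0-1023) into a MIDI control-change value (0-127). Names, ranges and the
# effect parameters in the comments are assumptions for illustration only.

def pot_to_cc(raw_reading: int, adc_max: int = 1023) -> int:
    """Scale a raw sensor reading into the 0-127 range used by MIDI CC."""
    clamped = max(0, min(raw_reading, adc_max))
    return round(clamped / adc_max * 127)

# Hypothetical gesture readings: an extended elbow gives one value,
# a bent wrist another.
elbow_reading = 712
wrist_reading = 301

print(pot_to_cc(elbow_reading))  # could, say, set a delay mix
print(pot_to_cc(wrist_reading))  # could, say, set a filter cutoff
```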

“Then I stumbled on a Gypsy MIDI controller suit, a thing that you wear that I think was originally used for early motion-capture work in films. I think I have the last one and it’s very fragile … There are 54 controllers in the suit that map the data accumulated as I move. This works for now. I’m also looking at heat monitors, Google Glass and eye monitors to create further values to add to the equation.”
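
As a rough illustration of what “mapping the data” could mean in practice, the sketch below assigns each of 54 sensors a MIDI continuous-controller number and packs a reading into a standard three-byte control-change message. The sensor names, controller numbers and channel are assumptions; the Gypsy suit’s actual protocol may differ.

```python
# Illustrative only: 54 suit sensors mapped onto MIDI continuous controllers.
# Sensor names and CC numbers are invented for the example; only the three-byte
# control-change layout (status, controller number, value) is standard MIDI.

SENSOR_TO_CC = {f"sensor_{i:02d}": i + 1 for i in range(54)}  # CC numbers 1-54

def to_cc_message(sensor: str, value: int, channel: int = 0) -> bytes:
    """Build a raw MIDI control-change message for one sensor reading."""
    status = 0xB0 | (channel & 0x0F)   # 0xB0 = control change, low nibble = channel
    return bytes([status, SENSOR_TO_CC[sensor], max(0, min(value, 127))])

# One (pretend) frame of suit data turned into MIDI messages:
frame = {"sensor_03": 96, "sensor_27": 12}
messages = [to_cc_message(name, val) for name, val in frame.items()]
print([m.hex(" ") for m in messages])
```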
