Carolan provides an unusual example of how to connect a musical instrument to a wealth of online digital information – to a ‘digital footprint’ of videos, audio recordings, documentation and also blog posts (such as this one). However, there is a sense in which many everyday instruments already do much the same, even if not directly through a technology such as the Artcodes and Muzicodes used by Carolan.
How so? Well, Juan Martinez Avila – a PhD student at the Mixed Reality Lab – has been studying how guitarists use digital resources on the Internet. Juan recently presented a paper on this topic at the Computer-Human Interaction (or CHI) conference in Glasgow. You can read Juan’s paper here, but in a nutshell, Juan recruited a group of skilled guitarists, observed how they use digital technologies and interviewed them. He describes how accomplished guitarists turn to digital resources to help them learn new material, from making recordings of musical sketches during rehearsals to analysing and practising along to YouTube videos when learning covers.
He also identifies three key challenges that guitarists face when doing this. First, they need to control digital resources while also holding and playing their instruments. How can they select and scroll through videos or control digital recorders when their hands are busy playing their guitars? Second, it is tricky to recall and line up the right resources for a particular practice session, so precious time is wasted rediscovering videos and recordings and getting them ready to use. Third, all of this becomes even more difficult when several musicians are working together, for example during a band rehearsal.
Juan’s paper raises three important requirements for designing future guitar technologies.
- First, it should be easier for guitarists to control digital media while they are playing – an idea more formally called ‘unencumbered interaction’.
- Second, guitars should be able to summon up appropriate digital resources for each situation they find themselves in so that these are ‘ready to hand’ – an idea called ‘contextual interaction’.
- Third, different instruments should be able to talk to each other (and perhaps also to other equipment such as amplifiers and effects pedals) so as to figure out what information needs to be recalled and shared among a group of musicians – an idea called ‘connected interaction’.
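To make the third idea a little more concrete, here is a purely illustrative sketch – not something from Juan’s paper, and with entirely made-up message fields – of how ‘connected’ instruments might share context with one another over a local network, so that one instrument learns what a bandmate is currently rehearsing:

```python
# Hypothetical sketch of 'connected interaction': two "instruments" exchange a
# simple context message over UDP on the local machine. The message fields
# (instrument, song, section) are illustrative assumptions, not from the paper.
import json
import socket

PORT = 9001  # arbitrary local port chosen for this demo

# One "instrument" listens for context announcements from its peers.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", PORT))

# Another "instrument" announces what the band is currently working on, so
# that shared resources (recordings, videos) could be recalled automatically.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
context = {"instrument": "guitar-1", "song": "cover-practice", "section": "chorus"}
sender.sendto(json.dumps(context).encode(), ("127.0.0.1", PORT))

# The listening instrument now knows the shared context of the rehearsal.
data, _ = listener.recvfrom(4096)
received = json.loads(data.decode())
print(received["song"], received["section"])

listener.close()
sender.close()
```

In a real system one might use an established music-networking protocol such as OSC or MIDI over the network rather than raw UDP, but the basic shape – instruments broadcasting context so that the right media is ‘ready to hand’ for everyone – would be the same.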
Carolan’s use of Artcodes and Muzicodes to connect to its digital footprint offers two ways in which future interactions with guitars might become more unencumbered and contextual – and, if we had more than one Carolan, perhaps even connected too. But there may be many others. What about the possibilities of voice control? Or gestural control? Or attaching new dials and sliders to the guitar itself? Or perhaps new kinds of media control pedals? There appears to be a bewildering array of technical possibilities, and it is not clear which route forward is best.
The next stage of Juan’s work is to engage further with guitarists to prototype new ways of connecting instruments to their digital footprints and interacting with them ‘in the moment’ and so explore more deeply the challenges of unencumbered, contextual and connected interactions. We look forward to telling you what they figure out in due course …