Technology and the History of the Role it has Played in Extant's Developing Practice
Sound Beam – Presencing Navigation through Sound
As part of its inaugural research block, ^The Stage Language Laboratory 1998, I included a workshop with Sound Beam as one of the collection of theatre styles and movement forms investigated by Extant research performers Damien O'Connor, Liam O'Carroll and Liz Porter. I invited Guy Evans to work with us: drummer with the 1970s prog-rock band 'Van Der Graaf Generator', creator of the outdoor sonic playground 'Echo City', and developer of a mobile music technology resource built while at Shape London, where he and I had first met as colleagues. From the suite of possibilities that Guy introduced us to, the Extant group chose Sound Beam, a programmable beam of ultrasound that could span a quadrant of space.
The aim of the days with Guy was to experiment with specific items of sound technology and find ways in which blind and partially sighted performers might use them to extend the boundaries of live performance. Guy used a multi-effects unit to set up virtual ambiences and experimented with a sampling effects unit, operating at first as a conventional keyboard would. He describes: 'My aim at this stage was simply to convey the concept of an invisible virtual keyboard and give people some feel for how Sound Beam responds, maintaining a clear link between cause and effect. I wasn't quite prepared for the fascination that this simple exercise would cause. The key was the absence of any physical object. Someone (Maria?) explained that to blind and partially sighted people, any sound at close proximity triggers instinctive caution because that sound signals the presence of an object to be negotiated. To be able to create sounds without physical objects was itself a liberating experience.' *3
Customised samples using group vocals created a multi-layered effect, triggered by movement exercises, so that the physical actions of people in the beam cued phrases and produced a surreal conversational form. We also had an operator move the beam around like an invisible torch while others tried to avoid detection, and we placed the beam in a secret position, giving one person the task of working out its location. Damien said, "The sound beam workshop was inspiring. Playing round with a beam, trying to detect and avoid it was a lot of fun and just by having fun, it became apparent that this piece of equipment could make a vast difference in terms of accessing a given space."
Liz said, "A moving sound beam could locate objects, i.e. lower sound if object further away, the nearer you get the higher the notes get."
Liam said, “no amount of explanation or description prepared me for just how amazing sound beam is when you interact with it.
I think the beam can designate quadrants of sound that can act like a spotlight, i.e. if a particular sound is being activated it means others are out of the way, which could assist the blind performer or audience member" *4
The outcome was that, in spite of its technical irregularities, we found the invisible beam might act as a general guiding instrument, felt along as a direction of sound, and could be used to demarcate, for the blind performer, the parameters of the performance space on stage. This system of spatial wayfinding, which treats sound as if it were a physical object to navigate by, is different to all other methods employed by visually impaired people, where sound is more commonly used through echolocation feedback, through speech and audio instruction, or through audio spatial mapping (see studies *5).
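As a purely illustrative sketch of the kind of distance-to-pitch mapping described in these exercises, the short Python fragment below maps a performer's proximity within an ultrasonic beam to a rising note. The beam range and note values are hypothetical assumptions made for the example, not Sound Beam's actual configuration or protocol.

```python
# Illustrative sketch only: map a performer's distance along an ultrasonic
# beam to a rising pitch, in the spirit of the workshop exercises described
# above. The range and MIDI note span are assumed values, not taken from
# Sound Beam's actual hardware or settings.

BEAM_RANGE_M = 4.0            # assumed maximum detection distance of the beam
LOW_NOTE, HIGH_NOTE = 48, 84  # assumed MIDI note span (roughly C3 to C6)

def distance_to_midi_note(distance_m: float) -> int:
    """Closer to the sensor produces a higher note, as Liz describes."""
    clamped = max(0.0, min(distance_m, BEAM_RANGE_M))
    proximity = 1.0 - (clamped / BEAM_RANGE_M)   # 0 = far edge, 1 = at the sensor
    return round(LOW_NOTE + proximity * (HIGH_NOTE - LOW_NOTE))

if __name__ == "__main__":
    for d in (4.0, 3.0, 2.0, 1.0, 0.2):
        print(f"{d:.1f} m from sensor -> MIDI note {distance_to_midi_note(d)}")
```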
The Dark – A Sonic Pathway to Navigate Performative Space
It was not until 2003 that another opportunity presented itself to re-engage with performance and technology, and this came through Braunarts, a BAFTA award-winning multimedia company who hired me as writer on their Culture Online project called The Dark. My first encounter with immersion in the dark as part of a public event was when I attended 'Dialogue in the Dark' in 1996 at the South Bank Centre. I have described the origins and intention of this touring exhibition in chapter 1. Adam Alston states, 'The installation aims to develop understanding of blindness and visual impairment by immersing visitors in a series of pitch black installations based on spaces found in the host city.' *6 The impression that Dialogue in the Dark made on me back in 1996 was on the whole negative, in that it left me with a sense that groups of both visually impaired and non-visually impaired people were taken on a simulated journey through what was allegedly my experience of the world. Given white canes, they were encouraged to move in pitch darkness through synthesised 'naturalistic' facsimiles of, for example, a road crossing, a park and a living room. The fact that some of these environments were outdoor ones relocated indoors compromised the stated aim to 'develop understanding of blindness and visual impairment', because all sorts of other tangential clues offered in the 'real world' were missing. As blind rambler Randy Pierce describes, 'I can walk along and I can feel when I am close to a massive tree because the air pressures change, or I can feel when I'm close to a glacial erratic because of the temperature change.' *7 A blind reviewer, Martin Buber, also comments, 'I also was reminded that being in the dark for an hour does not replicate what it is like to be blind because we have had years to adjust and find techniques that work for us in our daily lives.' *8 As well as this, thrusting gangs of non-visually impaired wanderers around in the dark, untrained in the use of white canes, generated high levels of anxiety, hilarity or fear that would interfere with any 'understanding'. Buber reflects, 'My first reaction was: this exhibit might be scaring people more than helping them.' *8
For me as a blind person going through this experience, as well as doubting the validity of the installation's aims, I felt boredom and frustration at this huge simulation exercise, which offered me nothing but prosaic, bland environments to explore. However, my later involvement with The Dark, the touring and online art event which premiered at the Dana Centre at London's Science Museum in 2004, overturned this perception and opened up to me artistic prospects of what technology and dark installation could be.
'We live in a society bombarded by images,' said Terry Braun, Director at Braunarts, 'and we rely on these images every day to find our way around and understand the world. In The Dark your eyes will be of no use to you – instead you will need to rely on your ears and your imagination to find your way through a maze of digital ghosts and to unlock the mysteries of their lives.' *9 'The installation took place in a completely dark empty space that groups of up to fifteen people could enter and experience simultaneously. Using a 20.4 surround sound system, visitors were immersed into an emotionally charged story set on a slave ship and were encouraged to follow the point of view of the characters by physically following the character's voice in the darkness.' *10 Inspired by the true story of Edward Rushton, I created overlapping story-worlds for the real characters of 'Edward Rushton – a Liverpudlian crewman on slave ships, who went blind and became a notable abolitionist; John Newton – a slave ship captain who became an Anglican clergyman and an abolitionist following an epiphany at sea; Quamina (named "Kunle" in The Dark) – a freed slave working as a crewman on ships who was taught to read and write by Edward Rushton.' *10 The lessons that The Dark taught me were invaluable: about building multi-perspective narratives, the spatialisation of sound, and the invitation to move in the dark. 'The combination of being in a completely dark space, a highly immersive 3D sound environment and being engaged in a very emotional story resulted in a wide range of responses in visitors. Some could "see" the structure of the ship through the sonic architecture created by 3D sound, some hallucinated random visualisations, some laughed and some cried.' *10 Between 40% and 50% of the 50,000 visitors to The Dark did move in the installation because, as producer Gabi Braun says, 'we actively encouraged moving in the introduction heard just before the experience began. We explained that different characters will talk at the same time and that the characters may move around, so if the visitors wanted to follow the characters' story, they would have to also move in the space.' *11 However, half did not move, preferring to play safe, standing still and allowing snatches of voices to come to them through the surrounding filmic soundscape.
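To illustrate the principle of localising a voice in space so that listeners can physically follow it, the sketch below distributes a voice's level across a small loudspeaker layout according to distance. The four-speaker positions and the simple inverse-distance gain law are assumptions made for the example; The Dark itself used a far more sophisticated 20.4 surround system and professional spatialisation tools.

```python
# A minimal, assumed sketch of distance-based panning across loudspeakers,
# showing how a voice placed at a point in the room is made louder in the
# nearer speakers so that a listener can walk towards it. Not the method
# or system used in The Dark.

import math

SPEAKERS = {                      # assumed (x, y) speaker positions in metres
    "front_left":  (-3.0,  3.0),
    "front_right": ( 3.0,  3.0),
    "rear_left":   (-3.0, -3.0),
    "rear_right":  ( 3.0, -3.0),
}

def speaker_gains(voice_pos, rolloff=1.0):
    """Return normalised per-speaker gains for a voice at voice_pos."""
    raw = {}
    for name, (sx, sy) in SPEAKERS.items():
        dist = math.hypot(voice_pos[0] - sx, voice_pos[1] - sy)
        raw[name] = 1.0 / (1.0 + rolloff * dist)  # nearer speaker, higher gain
    total = sum(raw.values())
    return {name: gain / total for name, gain in raw.items()}

if __name__ == "__main__":
    # A character's voice moving from the rear of the space towards the front left.
    for pos in [(0.0, -2.5), (-1.0, 0.0), (-2.5, 2.5)]:
        print(pos, {k: round(v, 2) for k, v in speaker_gains(pos).items()})
```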
The Cast Party – Airtime Linguistic Navigation
Inspired by my contribution to The Dark and the challenge of navigating obscured environments through an artistic interface, I next explored further the ways in which visually impaired people could be supported to move through large environments. In 2006, a year before Steve Jobs launched Apple's first iPhone, whose technology later enabled Hans Jørgen Wiberg to develop 'Be My Eyes' *18, the app-based navigation exchange that pairs sighted volunteers with visually impaired users, and later Aira Explorer *19, Extant carried out its first piece of remote human-to-human research, enhanced by the technology of the time, called The Cast Party. It was supported by Artsadmin, Orange and The Great Eastern Hotel, and went on to win the ^2007 Arts and Business Diversity Award. The project was a first-ever real-world attempt to research how a social arts environment could be made accessible to visually impaired people using personal describers, mobile phone technology, description and navigation at a site-specific experimental event. The Cast Party aimed to make accessible the social space that falls between formal access provision at arts events: the time before, during or after such events can be as much part of the arts experience, yet it remains inaccessible, since it is usually the art product itself, whether a film, play or visual art work, that receives formal audio description. 'During intervals, launches, in the bar after a show, we can often find ourselves not knowing who is around us, what interactions are taking place, what the layout of the space is etc, and without this information, our movement and communication can be impeded.' *15 As Blok and Westerlaken state in their research on visual impairment access within arts venues, 'individuals with a visual disability often have difficulties finding their way through an event. In most cases, they rely on others to navigate them through the building, which limits their freedom of movement.' *16
The objective of The Cast Party was not to create art content, but to simulate the 'in-between time' of an art event by creating a social networking gathering, and to use a discrete technical system to give each visually impaired person present access to their own live, remote commentary on what was happening around them. We then tested whether this method of access increased the levels of information available to the visually impaired people present, and whether this enhanced their ability to move around more independently, identify people more easily, or interact better with the people around them. We also explored the process of interaction between describer and visually impaired person, and began a basic assessment of this form of navigational access. In total, 15 visually impaired people were invited to take part in the project, along with 15 sighted people, all from a wide range of social backgrounds and occupations. A training day was held in advance of the main event, with all 30 participants attending at the venue, the Great Eastern Hotel, which was part of Liverpool Street Station at the time. This partnership provided the project with internal balcony architecture that offered the describers high surrounding vantage points above the main event, which was to take place below in the hotel's ground-floor atrium. The training day, held two weeks in advance of the event, allowed the matching of visually impaired participants with their describers, group discussion about differences in negotiating social situations, describing the hotel environment from non-sighted and sighted perspectives, and carrying out a series of detailed navigation exercises that used language only. The former mobile phone company Orange partnered the project by supplying all the mobile phones and airtime, and during the training day we set up the mobile phone equipment and connectivity, which was used to test remote navigational guiding exercises between the describers up on the balconies and their visually impaired partners in the space below. Some feedback from the session was:
"The exercise where I had to describe the room to my sighted guide was really interesting because all my references were connected to sound and not to light or colours; for example, I was able to say how much more of an echo there was when you were on the balconies rather than down on the main floor."
“For me it is also about looking at how I deal with making real time descriptions of space, movement and appearance and at the types of thing that sighted people take for granted”
“I really enjoyed the mutual describing of the room. It helped to create communication and relationship. The directions exercises helped not only to build on relationship but also we found that a language was developing like a piece of dance.”
“I would have liked to speak more with the other partners about how language enabled them to use the space. We did touch upon this briefly, but I was intrigued to hear about people’s preferences for certain descriptions, or motions, and it would have been interesting to explore these ideas further.”
“There was a moment where I looked up from guiding my partner. I saw fifteen visually impaired people on the ground having their movements controlled by fifteen sighted people staring down at them from various heights, and it didn’t feel entirely good. In fact, it was a bit Orwellian.”
“One thing that I was left being worried about for the night was the acoustics of the place and the noise levels there might be from the crowd and I’m not sure that we’ve quite tested this to its full – but I guess we’ll just see”
Two weeks later, around 100 guests, including the 30 participants from the training day, arrived for The Cast Party at the hotel. Invitations had only requested that guests arrive wearing something distinctive, so they could be tracked easily, and be prepared to take part in a task that would require their detection, communication and observational skills. Everyone at the party was randomly given a line from one of 15 unnamed famous films, and the guests then moved around, quoting their line to each other, in order to identify the other cast members from their particular film and complete teams of six or seven per film. The aim of this game was to break the ice among the party guests, ensuring everyone who attended would need to call on their communication, detection and description skills to complete the task, and to keep momentum and interactivity going, affording the describers above a dynamic environment to describe to their visually impaired counterparts in the crowd on the party floor. ^The Cast Party trailer.
The ultimate findings from the 30 participant evaluations were mixed, both regarding the experience of giving and receiving this personal commentary and regarding whether the visual sense could act virtually, mediated through language and hearing, in this type of environment. Some responses were:
“It felt that the game increased the speed of interactions and therefore the pressure to find something in a loud unfamiliar group.”
“I felt that this didn’t do what it needed to, in that the social interactions that it was supposed to encourage were reduced to a quick check if that person had the same film and then moving on.”
"It is difficult at times to churn out useful descriptions in a rapidly changing environment, and this led to a great deal of frustration during the party."
"I think the party game was a good way to test out its limitations, as it was a very dynamic situation. It might be interesting to see how it fares in more day-to-day situations like bars, or shops."
“It made me laugh a lot. So it was enjoyable but not very functional. I really couldn’t hear the description very well. And to switch focus from my left ear to someone coming up to talk to me was difficult. Between me and my describer it was fine but add a room of people and I found it chaos.”
"The equipment, although it worked well, did have its limitations, in particular sound quality and distortion. Specifically, during the party and whilst the MC was speaking there was an echo."
"If I suggested walkie-talkie radio as an alternative to mobile phones, would I be told that the former is 'outdated technology', like minidisc?"
"the headset and my apparent addressing of an imaginary person seemed to do what is usually the white stick's job of weirding people out."
“I thoroughly enjoyed the party game especially just opening my own envelope and having someone read the information on it remotely and not having to ask a participant to read it to me.”
“My high point was having the opportunity to walk around in an unfamiliar environment unaided by a white cane or sighted guide. My only frustration was sound quality of equipment.”
“If a confident describer and a high quality mobile phone system were combined I believe this method of navigation could work in any environment.”
“A quieter setting, maybe outside, would be a good next trial situation.”
“The possibilities are endless.”
“I really enjoyed the concept of the game, congratulations to whoever thought of it.”
"I will always remember the smiles when, with my remote description, Liam was able to turn to one of the female participants and say, 'I like your green dress', as an ice-breaker." *20
In spite of the mixed research outcomes, the project introduced a presencing of a disability aesthetic: through the partnerships formed, the positive reaction to the interactive game, and the general audaciousness of an experiment that put visually impaired concerns in the lead, in such a large and high-profile venue, in order to further an investigation of access and functioning in curated space. The Curiosity Paradox define this aesthetic, in contrast to standard access (a eugenic legacy), as 'Access Art', and describe how it 'has a specific flair that emerges based on who is present. It opens multiple ways of naming, defining, interpreting, translating, and creating across space and time.' They go on to define it as, 'Access Art describes the ways marginalized people and communities creatively grow resources, design accessibility, celebrate joy and resistance, out-manoeuvre supremacy culture, and dream worlds beyond the impossible.' *21 This certainly was the rebellious nature of The Cast Party project. Mitchell and Snyder have written extensively about the political importance of this type of occupation of public space by the disabled body: 'aesthetic judgments about the built environment remain unquestioned when architects make the case against accessible designs on the grounds that access produces ugly buildings, despite the fact that those buildings called beautiful are fashioned to suppress the disabled body from public view. … Indeed, aesthetics may be the most effective means of bridging this gap, for in the absence of aesthetic representation, it is not clear that human beings would be able to imagine what political community is, let alone understand their place in it.' *22 The project afforded a confidence in this imagining, a new accessible relationship with the interior space of a building, and this in turn generated the curiosity to explore this terrain further.
The Cast Party Report 2006
Haptics – The Move to Tactile Navigation
In the same year, 2006, I was informed of a demonstration that had been given by Angela Geary of Camberwell School of Art on a device called the 'Haptic Pen'. Haptics being the science of the tactile sense, this was a hand-held, wand-like device that allowed a user to feel virtual objects through a force-feedback mechanism built into it. When I visited her at the School's Haptic Lab, designed to support art conservation studies, Angie set up the pen, and through it I could discern the outlines of virtual cubes and spheres hanging in mid-air. As with Sound Beam, I recognised the similarity: technology simulating the presence of objects in space, but through touch rather than sound. I was immediately drawn to investigating further what this could bring to performance with regard to visual impairment and to experiencing the presence and absence of objects across different modalities. As Birsel, Marques and Loots comment, 'Artists have long been fascinated by and making use of technology as a medium. The cognitive, social, and cultural interplay between technology and society has fuelled the experimentation around new aesthetics and a symbolic language in art'. *23 To progress this, Angela then took me to meet Professor William Harwin of the Cybernetics department at Reading University and his PhD student Adam Spiers, who had just developed something called the Haptic Torch, described as follows: 'The device, housed in a torch, detects the distance to objects, while a turning dial on which the user puts his thumb indicates the changing distance to an object.' *24 Using ultrasound, much as Sound Beam does, Adam had developed the Haptic Torch as assistive technology for visually impaired people, detecting objects for wayfinding rather than rendering virtual haptic objects in space. He was very interested in collaborating on an Extant project that brought his skills together with the potential field of immersive performance, both of us having just experienced Punchdrunk's production of 'Faust'. A few months later, in 2007, I was invited to work on Punchdrunk's The Masque of the Red Death as access consultant, and this enabled me to learn about constructing theatrical worlds covering large-scale, eclectic environments such as the Battersea Arts Centre, and how visually impaired audiences might move through and participate in such an experience. This associated more with Adam's form of wayfinding haptics, and so Extant made an application to the Wellcome Trust to fund our first haptic performance exploration. It was turned down. Next we approached Orange, who had supported The Cast Party, and pitched to their R&D lab, which led nowhere. Unperturbed, we applied to the new Technology Strategy Board for support as an SME and engineering partnership, Adam now being based at the Bristol Robotics Lab. After some issues recruiting an academic partner, we invited Janet Van Der Linden and Yvonne Rogers of the Open University to join us, and we finally secured initial funding to begin.
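As a rough illustration of the mapping the Haptic Torch description implies, the sketch below converts an ultrasonic distance reading into the angle of a dial under the user's thumb, so that changing proximity is felt rather than heard. The sensor range, dial travel and values are hypothetical assumptions for the example; they are not taken from Adam Spiers's actual design.

```python
# Illustrative sketch, not Adam Spiers's implementation: convert an assumed
# ultrasonic range reading into an assumed thumb-dial angle, so that nearer
# objects turn the dial further and are felt as greater proximity.

MAX_RANGE_M = 3.0        # assumed useful range of the ultrasonic sensor
DIAL_TRAVEL_DEG = 180.0  # assumed mechanical travel of the thumb dial

def distance_to_dial_angle(distance_m: float) -> float:
    """Nearer objects turn the dial further, signalling proximity by touch."""
    clamped = max(0.0, min(distance_m, MAX_RANGE_M))
    proximity = 1.0 - (clamped / MAX_RANGE_M)   # 0 = nothing in range, 1 = touching
    return proximity * DIAL_TRAVEL_DEG

if __name__ == "__main__":
    for reading in (3.0, 2.0, 1.0, 0.5, 0.1):
        angle = distance_to_dial_angle(reading)
        print(f"object at {reading:.1f} m -> dial turned to {angle:.0f} degrees")
```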