This blog contextualises a project exploring sEMG (surface electromyography) biofeedback in performance and sound art. The project is supported by the a-n New Collaborations scheme.

The author is Victoria Gray, a visual artist specialising in performance. The project collaborators are Oliver Larkin, a musician and software programmer, and Alex Grey, system designer, algorithm designer, patent author of the MyoLink sensor technology and co-founder of Somaxis, Palo Alto, California.



Text by Victoria Gray

Potential and Further Links

In this short time it has only been possible to scratch the surface of the project's potential. It is clear that developing the MyoLink software and adapting the MyoLink hardware to our specific installation specifications is more complex than first thought. However, enough formative knowledge has been gathered to imagine how we can move forward and, potentially, with which new partners.

Despite the work being in progress, it was interesting to consider who these potential partners might be. This gave a broad picture of the multiple contexts and fields within which a project like this might develop. A meeting with Science City York was extremely productive, suggesting ways in which the organisation might support future funding bids for this project, in particular with the Wellcome Trust. It was also possible to view the 3Sixty immersive demonstration space (part of the University of York's Ron Cooke Hub) as a potential lab for further experimentation and formative dissemination.

What is very clear is that the relationship between art, science and technology is ‘trending’ right now in both research and art contexts. The result of this interest is largely positive, as we see a developing discourse between disciplines that no longer need to be considered in isolation. There is also an increased drive to fund these initiatives, and so the economy within these sectors and areas of research is burgeoning. Personally, however, I am also suspicious of this ubiquity. My specific concern is that projects are compelled to steer towards areas of research where there is financial support, potentially jeopardising artistic integrity. This is of course a cynical view, but an important one to bear in mind, especially as creativity risks instrumentality within larger policy-driven motivations that often result in increased funds and more projects, but not necessarily better art work. It is important to be clear about the motivations for this research, particularly that it developed out of an entirely different set of concerns, more personal than economic.

On reflection, the project has suitable expertise, and the potential for future partnerships with the organisations visited is strong. In order to prepare the project for the next phase, a list of relevant organisations in the UK was compiled. The intention is to develop the project through some of these channels from 2014 onwards.

Wellcome Trust, Arts Awards: http://www.wellcome.ac.uk/Funding/Public-engagement/Funding-schemes/Arts-Awards/index.htm

Arts and Humanities Research Council (AHRC): http://www.ahrc.ac.uk/Funding-Opportunities/Research-funding/Themes/Science-in-Culture/Pages/Current-funding-opportunities.aspx

NESTA: http://www.nesta.org.uk/about-us

Biotechnology and Biological Sciences Research Council (BBSRC): http://www.bbsrc.ac.uk/home/home.aspx

Northern Arts and Science Network: http://northernartsandsciencenetwork.blogspot.co.uk/

UKRC: http://www.ukrc.org.uk/aboutus

Superposition: http://www.thesuperposition.org/

Theatre And Performance Research Association (TAPRA): http://tapra.org/groups/performance-and-new-technologies-working-group/

Fieldwork: http://fieldwork-blog.tumblr.com/

The Art Catalyst: http://www.artscatalyst.org/

Art & Science: http://www.artandscience.org.uk/

Art Science Collaborative (ASCUS): http://ascus.org.uk/

British Arts Festivals Association (bafa): http://www.artsfestivals.co.uk/

Foundation for Art and Creative Technology (FACT): http://www.fact.co.uk/



Text by Victoria Gray

Context

After focussing on digital technology, it felt important to return to the performance component of the project.

During this project I authored a paper titled, Sound Affects: The Sonification of Energetic Exchange in Performance*.

To show how this project speaks to conceptual and philosophical discourse in the field of performance studies, I have chosen to reproduce two short excerpts and reflections here. The excerpts are in non-linear order and in a deliberately fragmented form, suited to a blog rather than an academic paper.

* Delivered at: TaPRA, Performance and New Technologies Working Group. The University of Glasgow & The Royal Conservatoire of Scotland, UK

……

Context 1:

Performance in the ‘post-aesthetic turn’

During the early 1990s there was a paradigm shift in contemporary European choreography, encompassing the work of La Ribot, Xavier le Roy, Mårten Spångberg, Jérôme Bel, Boris Charmatz, Tino Sehgal and Eszter Salamon, for example. This movement, coined ‘post-aesthetic’ by Bojana Cvejić (2010), posed a perceptual problem, asking how to shift the perceptibility of movement from vision to kinaesthetic sensibility. In the ‘post-aesthetic turn’, therefore, performance challenged the aesthetic dominance of the visual (Cvejić 2010). It did so through strategies of stillness, slowness and micro-movement. In the ‘post-aesthetic’ turn, attention is brought to somatic processes that are largely ‘performing’ underneath the skin and are thereby thought to be ‘immaterial’, and yet these processes have strong bodily affects.

Reflection: How can sEMG challenge the aesthetic dominance of the visual further, asking how to shift the perceptibility of movement from visual to auditory sensibility?

Reference:

Cvejić, B. (2010) The Politics of Problems. Dance & Politics Conference / Dance, Politics & Co-Immunity, Giessen, 11-14 November. [Internet]. Available from: <http://www.thinking-resistance.de/> [Accessed 14 June 2013].

Context 2:

The Transmission of Affect

As a further point of departure, this research project reconsiders Teresa Brennan’s theories regarding the generation and ‘transmission of affect’ between bodies as a process of electrical entrainment (2004). I began to develop the theory that ‘electrical entrainment’ is the primary affective mechanism in my solo performance practice. During performance, my body’s kinetic and kinaesthetic effort produces electrical energy. In order for my skeletal muscles to contract (an activity which may or may not be visible to an audience), an electrical signal is sent from the central nervous system, which in turn innervates the muscle fibres. This causes a series of electrophysiological processes to take place, in short, generating electrical potential. This electrical output can be picked up at the skin through surface electromyography. In this biofeedback process, conductive electrodes act as a contact between the skin and a sensor. In turn, the sensor amplifies the raw sEMG signal and outputs it, converting it into digital information.

Reflection: How can sEMG render the arguably ‘immaterial’ process of electrical entrainment and affective transmission palpable for performer and audience via sonic means?

Reference:

Brennan, T. (2004) The Transmission of Affect. Ithaca: Cornell University Press.



Text by Victoria Gray

Technicality (Part 3)

Correspondence

Question: V and O to A

What is the transmission distance of the sensors? How many sensors can run at any one time?

Answer: A to V and O

I’m not sure that the transmission distance of BLE in this context will be sufficient for your purposes (transmitting data to a computer during live performance). Usually we don’t worry about transmission distance because we are streaming data to a phone that is being worn on the body. For development purposes this is no issue but I wonder about limitations for performances.

For example, let’s define ‘the performance space’ as the area in which Victoria can move around while maintaining good data signal strength to the receiver. The performance space would be limited to the transmission distance of BLE to the receiver, which ideally is located in the middle of the performance space, though obviously that may not be practical in your case. Our transmission distance is something like 15 feet, which means a circle roughly 30 feet across (a 15-foot radius), with responsiveness dropping away toward the edge of the circle. We don’t know for sure yet, because we are still finalizing the design and materials of the enclosure, which affects transmission strength.

We have purposely designed the firmware to try and minimize power burn through BLE, but it may be possible to increase transmission range if we sacrifice battery life / operating time. I can see another type of possible solution in which Victoria is wearing a thin iPod touch strapped on in some manner. An app running on that could stream data via WiFi to your Mac, enabling a much larger possible performance space. The major downside is having to wear an iPod touch on your body.
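To make the iPod touch idea more concrete, here is a minimal Python sketch of the streaming pattern Alex describes: a body-worn relay sending readings to a nearby computer over the local network as small UDP datagrams. The packet format, field names and loopback addresses are illustrative assumptions, not part of the MyoLink SDK; an actual relay would run as an iOS app.

```python
import json
import socket

# "Computer" side: bind a UDP socket on an OS-assigned local port.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
addr = receiver.getsockname()

# "Relay" side: encode one muscle-tension reading and send it as a datagram.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
reading = {"sensor": 1, "tension": 42}
sender.sendto(json.dumps(reading).encode("utf-8"), addr)

# Back on the computer: receive and decode the datagram.
data, _ = receiver.recvfrom(1024)
decoded = json.loads(data.decode("utf-8"))
```

UDP suits this kind of live performance data because a late packet is worthless anyway; dropping it is better than stalling the stream to retransmit.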

If the receiver is a computer, for example, the number of sensors is effectively unlimited, although this depends on the software; eight is the average. Something to consider, even at this early stage, is how you can use different numbers of sensors in different positions to do different things. One thing that strikes me is that you have the potential for information coming from up to ten sensors simultaneously, each of which can transmit muscle data. We can keep that in mind when putting together a framework to accept the data, in particular if what we want to do is recognize gestures, movements, and combinations of movements.
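One way such a framework could recognise combinations of movements is simple thresholding: a gesture fires when every sensor in a named combination is active at once. The sketch below is a hypothetical illustration, assuming the approximate 1–100 tension scale mentioned in the earlier correspondence; the sensor names, gesture definitions and threshold are all invented for the example.

```python
THRESHOLD = 60  # tension level above which a muscle counts as "active" (assumed)

# Each gesture is a set of sensors that must all be active simultaneously.
GESTURES = {
    "raise_both_arms": {"left_bicep", "right_bicep"},
    "clench": {"left_forearm"},
}

def active_muscles(readings):
    """Return the set of sensors whose tension exceeds the threshold."""
    return {name for name, tension in readings.items() if tension > THRESHOLD}

def recognise(readings):
    """Return the names of gestures whose required sensors are all active."""
    active = active_muscles(readings)
    return [name for name, required in GESTURES.items() if required <= active]

# One frame of data from four sensors:
frame = {"left_bicep": 80, "right_bicep": 72, "left_forearm": 10, "right_forearm": 5}
```

Here `recognise(frame)` reports only `raise_both_arms`, since the forearm readings sit below the threshold. A real system would also need smoothing over time, but the set-based structure shows how combinations scale to ten simultaneous sensors.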

….

Reflection on Q&As

These initial technical questions highlighted how critical the technical collaboration is. The conversations also demonstrated how a collaboration between York, UK and California, USA can take place through online forums.



Text by Victoria Gray

Technicality (Part 2)

Correspondences

Question: V and O to A

How does the sensor sample and transmit data? Can you give us further information about the SDK (Software Development Kit)? Will this be compatible with OSX not just iOS?

Answer: A to V and O

The sensor communicates with the iPhone using integrated Bluetooth Low Energy (BLE). Nothing extra is needed, just a sensor and an iPhone. The SDK is for interfacing with our sensor via iOS (the mobile operating system developed and distributed by Apple Inc). With some modifications, it can be adapted for OSX (the operating system that powers Mac computers). This would necessitate having an iPod Touch / iPad / iPad mini / iPhone running software that would then broadcast the filtered, processed data to a nearby computer via WiFi.

Sample iPhone app code that is easy to use and iterate on will be released to you, along with sample code that allows you to interface with the live sensor data on OSX. In short, the SDK will bundle a lot of pre-configured filters and algorithms so you can basically pull out a stream of numbers from 1 to 100 (approximately) that reflect muscle tension.
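That 1–100 tension stream is exactly the kind of value a sonification layer could consume. As a hypothetical sketch (not part of the SDK), here is one way to map a tension value linearly onto an audible frequency range; the chosen bounds of 110 Hz and 880 Hz are arbitrary assumptions for illustration.

```python
LOW_HZ, HIGH_HZ = 110.0, 880.0  # illustrative pitch range: A2 up to A5

def tension_to_frequency(tension):
    """Linearly map an approximate 1-100 tension value to LOW_HZ..HIGH_HZ."""
    t = min(max(tension, 1), 100)      # clamp to the documented range
    fraction = (t - 1) / 99.0          # 0.0 at rest, 1.0 at full effort
    return LOW_HZ + fraction * (HIGH_HZ - LOW_HZ)
```

A synthesis environment on the Mac (Max/MSP, SuperCollider, or custom software) could then drive an oscillator with the result, so that rising muscle tension is heard as rising pitch.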

2000 samples per second (2 ksps) per sensor is the maximum transmission rate. Data can be sampled at higher rates (at least 20 kHz, perhaps). Realistically, for muscle activity, we need the 2 ksps transmission only for the frequency filters we are going to be putting on the raw data, computer-side. With a 2 kHz data rate, a band-pass filter is limited to 1000 Hz (the Nyquist frequency), which is the upper limit of the frequency spectrum of sEMG signals. We could get away with less. The majority of the frequency content of interest for us, since this is not research, is probably below 500 Hz. So if we only go 2x on Nyquist, we could reduce transmission to 1 ksps and still be able to apply our band-pass filters.
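The sampling arithmetic above can be checked in a couple of lines. The Nyquist theorem says the highest frequency recoverable from a sampled stream is half the sample rate:

```python
def nyquist_hz(samples_per_second):
    """Highest frequency recoverable from a stream at this sample rate."""
    return samples_per_second / 2

# At the full 2 ksps transmission rate, filters can reach up to 1000 Hz:
full_rate_limit = nyquist_hz(2000)

# If the sEMG content of interest sits below 500 Hz, halving the
# transmission rate to 1 ksps still leaves that whole band recoverable:
reduced_rate_limit = nyquist_hz(1000)
```

This is why the correspondence concludes that 1 ksps would suffice: the 500 Hz band of interest fits exactly under the reduced rate's Nyquist limit.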



Text by Victoria Gray

Technicality (Part 1)

The technical dimension of the project has exposed my lack of technological skills. As a performance artist, my techniques are techniques of the body, not of ‘hardware’ and ‘software’. In order to address this, a formative meeting was set up with Dr Andy Hunt, Senior Teaching Fellow in Music Technology and Chair of the Electronics Teaching Committee at the Department of Electronics, University of York. Andy was able to advise on some of the technical questions that were arising and, most importantly, was able to put biofeedback technology in a wider cultural context by sharing a lecture he was preparing on emergent technologies.

As an artist I mainly work with my body and occasionally hand-made objects, so this was a jolt to my system and made me feel ‘other’ to the process. However, Dr Andy Hunt’s expertise was especially relevant to our project since his own research outputs are in interactive sonification (the science of displaying data as sound, with real-time user interaction), a field in which Hunt is known internationally for his pioneering work.

At this point, it was critical to allow Oliver and Alex to correspond over the technical aspects of the project, which from my perspective read like a programming language that I did not always fully understand. Despite this, I opted to follow the threads and contribute to the discussions, making sure the eventual performative outcome was kept in the foreground. Acknowledging the skills of others means placing trust in one’s collaborators, which is vital. In this way, diverse skills can be brought together for the mutual benefit of all the artists involved.

In the following posts it seems appropriate to share a selection of the correspondence between myself, Oliver and Alex, which fleshes out the technical dimensions of the project. Publishing it here as non-linear ‘raw data’ shows the networked dimension of this project and the extent to which the software language is (for me at least) highly nuanced.

Correspondence 1

Question: V and O to A

How are the sensors built? How are they attached to the body? What is their battery life? What is the API (Application Programming Interface)?

Answer: A to V and O

Each sensor consists of a central module and an accessory, such as adhesive wings, a clip or a band. The module has an EXG sensor (sEMG, EKG, EEG), an accelerometer, memory, and a micro-controller with an integrated BLE radio. sEMG is the muscle-activity sensing modality in the MyoLink sensor.

The sensor runs on a 2032 coin cell, either rechargeable lithium-ion or single-use. It streams raw EXG data at roughly 2 ksps and yields roughly four hours of continuous operation, with battery life scaling up as the transmission rate decreases.
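This trade-off between transmission rate and operating time can be sketched with a simple model. The inverse-proportional scaling below is an illustrative assumption on my part, not a Somaxis specification; only the two anchor figures (roughly four hours at 2 ksps) come from the correspondence.

```python
BASE_RATE_SPS = 2000   # full streaming rate, samples per second
BASE_HOURS = 4.0       # approximate continuous runtime at that rate

def estimated_runtime_hours(rate_sps):
    """Estimate runtime assuming battery drain scales with transmission rate."""
    return BASE_HOURS * (BASE_RATE_SPS / rate_sps)
```

Under this model, dropping to the 1 ksps rate discussed earlier would roughly double the operating time to eight hours, comfortably covering a durational performance.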

We currently use an API running on a PC (C#/.NET) with an ANT dongle attached. This grabs the raw data coming out of the sensor and saves it. The PC app has a live raw-data graph view as well.

