Dr Karen Wood and Genevieve Say make up the Stream Project, and we have been working in collaboration with neuroscientist Tony Steffert, who is currently completing his PhD in EEG and sonification. For our first collaboration we wanted to make a solo dance performance that shows the dancer’s physiological activity through sound and animation/lighting, with which she can then interact. The aim of this project is to make the internal external to the audience through the use of sound, animation and lighting.
Our main aim is for the audience to see the internal activity of the dancer and how she interacts with her own physiological state. A small portable device with electrodes and other sensors is placed on the dancer to transmit real-time physiological data, which controls the animation and sound. Previously, the dancer has then used the sound to shape the choreography.
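As a rough illustration of the loop described above, the sketch below smooths incoming sensor samples (here, heart rate) and rescales them into a normalised control value an animation or sound engine could consume. The class name, sensor range and smoothing factor are illustrative assumptions, not the project's actual software.

```python
# Hypothetical sketch: smooth raw sensor samples and map them onto a
# 0.0-1.0 control parameter for animation/sound. Names and ranges are
# illustrative assumptions, not the Stream Project's actual code.

class SensorToControl:
    """Smooth raw sensor samples (e.g. heart rate in BPM) and map them
    onto a normalised 0.0-1.0 control parameter."""

    def __init__(self, lo=50.0, hi=150.0, alpha=0.2):
        self.lo, self.hi = lo, hi    # expected sensor range (assumed)
        self.alpha = alpha           # smoothing factor (0 < alpha <= 1)
        self.smoothed = None

    def update(self, sample):
        # An exponential moving average damps sensor jitter so the
        # lighting/animation does not flicker on every reading.
        if self.smoothed is None:
            self.smoothed = sample
        else:
            self.smoothed += self.alpha * (sample - self.smoothed)
        # Clamp and rescale into [0, 1] for the animation engine.
        value = (self.smoothed - self.lo) / (self.hi - self.lo)
        return max(0.0, min(1.0, value))

# Simulated heart-rate stream rising as the dancer moves.
ctrl = SensorToControl()
for bpm in [72, 75, 90, 110, 130]:
    level = ctrl.update(bpm)
print(round(level, 2))
```

In a live setting each `update` call would be fed by the wireless device's stream; the smoothing keeps the mapping legible to an audience watching the dancer.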
So far, we have presented the work at two events, showing the physiological parameters of electroencephalography (EEG), heart rate variability (HRV) and respiration via a projection screen of graphical images and sound.
Research and development: http://youtu.be/-5JloQPBr70
We are particularly interested in brainwave data obtained through neurofeedback, a therapy that uses EEG data to reduce certain brainwave states in favour of others. Initially we wanted to work on converting EEG and HRV data into movement and sound, and to experiment with moving while recording EEG and HRV data in real time. We would now like to experiment with different ways of communicating with the audience, and to explore the challenges this presents. The project team would like to explore ways of working with EEG and HRV data to produce a live, projected piece of dance performance, which could result in an immersive, interactive environment and/or a stage production. We now envisage connecting the dancer’s EEG, HRV and respiratory data to a real-time feed that drives lighting or projection.
We will no longer use sound as we had done previously. Our composer, Dr Gavin Wayte, is interested in composing a piece of music for us using the previously recorded EEG, HRV and respiratory data as stimulus for the composition. This means we would not have to focus so much on sound, and lighting or animation/visual imagery would become the main element.
This is an original way of incorporating real-time feedback of the body’s physiology into performance. EEG and HRV have not previously been used to create lighting that interacts with dance movement. There is originality in creatively examining how EEG can be affected by, and/or have an effect on, an individual within performance. There is scope to convert the data into visual effects that give the performance an innovative aesthetic. Ben will work with us to write code that converts the data into lighting effects.
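One plausible shape for such a data-to-lighting conversion is sketched below: it estimates the relative power of the alpha band (8–12 Hz, associated with a calm, relaxed state) in a window of EEG and maps it to a 0–255 dimmer value. The band ranges, sampling rate and brightness mapping are assumptions for illustration, not the mapping Ben will actually implement.

```python
# Hypothetical sketch of EEG -> lighting: relative alpha-band power
# drives a 0-255 dimmer value. Band edges, sampling rate and the
# brightness mapping are illustrative assumptions.
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz

def band_power(signal, fs, lo, hi):
    """Total spectral power of `signal` between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return power[mask].sum()

def eeg_to_light(signal, fs=FS):
    """Map relative alpha power (8-12 Hz) to a 0-255 dimmer value.
    More alpha (a calmer state) -> brighter light."""
    alpha = band_power(signal, fs, 8.0, 12.0)
    total = band_power(signal, fs, 1.0, 40.0)
    rel = alpha / total if total > 0 else 0.0
    return int(round(min(1.0, rel) * 255))

# Example: a synthetic pure 10 Hz "alpha" oscillation.
t = np.arange(FS) / FS
alpha_wave = np.sin(2 * np.pi * 10 * t)
print(eeg_to_light(alpha_wave))  # -> 255 (all power in the alpha band)
```

In performance, a loop like this would run on each fresh window of data, with the output sent on to the lighting desk, so the stage state tracks the dancer's physiological state from moment to moment.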
This project takes cutting-edge neuroscience technology that will soon become ubiquitous in computer gaming. Building on growing public curiosity about neuroscience, the project will integrate neuroscience with a creative art form.
Future funding will be sought from ACE, and the long-term plan is to apply for a Wellcome Trust Small Arts Grant; it is felt that the project needs further development before applying for this.
We are looking at putting on a performance of this work in collaboration with Manchester Science Festival. We are also approaching gallery spaces that might be interested in presenting the work.