Raw:motions is an ongoing collaborative art project by Andy Buru and the photographer Joakim Erixon Flodman. The project explores the emotions evoked by being bound in ropes. A guiding principle is to create interesting images and performances without flirting with pornography or with the Japanese heritage of shibari/kinbaku (obviously this is impossible, but we do our best).
The first goal of raw:motions is to create a photo exhibition at Wip Konsthall in Stockholm in November 2017. As part of the exhibition we want to showcase a performance that presents, in real time, the emotions of people being bound. This is done by detecting changes in brain-wave patterns using electroencephalography (EEG) and transforming these waves into visuals and audio.
We are therefore looking for support from a combination of visual and audio artists and technology thinkers. See below for more details on the solution architecture and the help needed. Note that this is a passion project, so we cannot fund any help.
August 2017 Buy an EEG headset, connect it to a laptop and start sampling data from rope sessions.
September 2017 Find collaborators for auditive and visual effects, and create a first prototype of the live data interface to export samples.
October 2017 Buy output equipment for the auditive and visual effects, and make a first integration with the live data interface.
November 2017 Practice the performance for the exhibition at Wip Konsthall.
The solution architecture uses a wireless Bluetooth EEG headset to measure brain activity. Basic emotions (like happy, sad, calm and angry) are guessed by measuring the level of brain activity and whether it has a positive or negative connotation. The data is then fed into a live data interface where it can be consumed by presentation units. Presentation units can be things like a four-channel audio mix, a light design and a visual projection. The goal is to give the audience a deeper experience of what is happening in the performance.
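The emotion-guessing step described above can be sketched as a simple quadrant lookup. This is a minimal sketch, assuming the headset exposes two normalised signals, arousal (activity level, 0 to 1) and valence (negative to positive, -1 to 1); the names and thresholds are illustrative, not from any specific headset SDK.

```python
def guess_emotion(arousal: float, valence: float) -> str:
    """Map an (arousal, valence) reading to one of four basic emotions.

    High arousal = active, positive valence = positive connotation.
    """
    active = arousal >= 0.5
    positive = valence >= 0.0
    if active and positive:
        return "happy"   # active/positive
    if active:
        return "angry"   # active/negative
    if positive:
        return "calm"    # passive/positive
    return "sad"         # passive/negative

print(guess_emotion(0.8, 0.5))   # prints "happy"
print(guess_emotion(0.2, -0.3))  # prints "sad"
```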
Live data interface
How the live data interface will work is still being designed. The requirements are for the data to be near real time (around 100 ms of latency) and in a format that is easy for the presentation units to pick up. The whole setup will probably initially run on one Ubuntu laptop, using some IP-based streaming protocol so that it can scale in the future.
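One possible shape for such an interface is a small JSON-over-UDP stream that any presentation unit can listen to. This is only a sketch under those assumptions; the port, message format and field names are placeholders, not a settled design.

```python
import json
import socket

PORT = 9000  # placeholder port for the live data interface


def send_sample(sock: socket.socket, arousal: float, valence: float,
                addr=("127.0.0.1", PORT)) -> None:
    """Broadcast one EEG-derived sample as a small JSON datagram."""
    msg = json.dumps({"arousal": arousal, "valence": valence}).encode()
    sock.sendto(msg, addr)


# Loopback demo: a "presentation unit" socket receives what the sender sends.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", PORT))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

send_sample(tx, 0.7, -0.2)
data, _ = rx.recvfrom(1024)
sample = json.loads(data)
print(sample)  # {'arousal': 0.7, 'valence': -0.2}

tx.close()
rx.close()
```

UDP keeps the latency low and lets several presentation units consume the same stream; a lost datagram just means one skipped sample, which is acceptable for a live visual.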
Four channel audio
The audio experience is probably the biggest challenge: how do we translate the raw data into an experience that is both understandable and interesting to an audience? A possible solution could be to have four tracks, instruments or loops representing the four corners of the diagram (active/positive, active/negative, passive/positive and passive/negative) and then automatically mix between them. Since the project is still in a planning phase, the rate of change in the diagram is unknown. It would be good to consider, right from the start, mixing to a multichannel configuration.
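The automatic mix between the four corner tracks could be computed with bilinear weights, so the gains always sum to one and the sound glides smoothly as the diagram position moves. A minimal sketch, assuming arousal and valence have been normalised to the 0 to 1 range; the track names are placeholders.

```python
def corner_gains(arousal: float, valence: float) -> dict:
    """Bilinear gains for the four corner tracks of the emotion diagram.

    The four gains always sum to 1.0, so the overall level stays constant
    while the balance shifts towards the nearest corner.
    """
    a = min(max(arousal, 0.0), 1.0)  # clamp to 0..1
    v = min(max(valence, 0.0), 1.0)
    return {
        "active/positive":  a * v,
        "active/negative":  a * (1.0 - v),
        "passive/positive": (1.0 - a) * v,
        "passive/negative": (1.0 - a) * (1.0 - v),
    }


gains = corner_gains(0.75, 0.25)
print(gains)  # the four weights sum to 1.0
```

Each gain would then drive the fader of its track; since the rate of change in the diagram is still unknown, some smoothing (e.g. a short moving average over recent samples) may be needed to avoid jittery mixing.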
Light design
The light design is probably the easiest part: different colours can represent the different corners of the diagram, or we could play with front and back lighting.
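The colour idea could work by assigning one colour per corner and blending them by position in the diagram. A sketch under assumed inputs (arousal and valence normalised to 0 to 1); the corner colours here are arbitrary placeholders, not a finished palette.

```python
# Placeholder RGB colour per corner of the emotion diagram.
CORNERS = {
    "active/positive":  (255, 200, 0),   # warm yellow
    "active/negative":  (255, 0, 0),     # red
    "passive/positive": (0, 180, 255),   # light blue
    "passive/negative": (40, 0, 120),    # deep violet
}


def light_colour(arousal: float, valence: float) -> tuple:
    """Bilinearly blend the four corner colours by diagram position."""
    a, v = arousal, valence
    weights = {
        "active/positive":  a * v,
        "active/negative":  a * (1 - v),
        "passive/positive": (1 - a) * v,
        "passive/negative": (1 - a) * (1 - v),
    }
    return tuple(
        round(sum(weights[k] * CORNERS[k][i] for k in CORNERS))
        for i in range(3)
    )


print(light_colour(1.0, 1.0))  # (255, 200, 0): pure active/positive corner
```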
Visual projection
A visual projection could be displayed on a screen behind the performance, but this is probably the least important part, as the rope performance itself will provide the visual entertainment.
Video of a rope performance
Here is an example of the strong emotions that can be evoked by being bound.